Dec  5 06:43:15 np0005546954 kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec  5 06:43:15 np0005546954 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec  5 06:43:15 np0005546954 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  5 06:43:15 np0005546954 kernel: BIOS-provided physical RAM map:
Dec  5 06:43:15 np0005546954 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec  5 06:43:15 np0005546954 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec  5 06:43:15 np0005546954 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec  5 06:43:15 np0005546954 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec  5 06:43:15 np0005546954 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec  5 06:43:15 np0005546954 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec  5 06:43:15 np0005546954 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec  5 06:43:15 np0005546954 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec  5 06:43:15 np0005546954 kernel: NX (Execute Disable) protection: active
Dec  5 06:43:15 np0005546954 kernel: APIC: Static calls initialized
Dec  5 06:43:15 np0005546954 kernel: SMBIOS 2.8 present.
Dec  5 06:43:15 np0005546954 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec  5 06:43:15 np0005546954 kernel: Hypervisor detected: KVM
Dec  5 06:43:15 np0005546954 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec  5 06:43:15 np0005546954 kernel: kvm-clock: using sched offset of 3130227504 cycles
Dec  5 06:43:15 np0005546954 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec  5 06:43:15 np0005546954 kernel: tsc: Detected 2800.000 MHz processor
Dec  5 06:43:15 np0005546954 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec  5 06:43:15 np0005546954 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec  5 06:43:15 np0005546954 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec  5 06:43:15 np0005546954 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec  5 06:43:15 np0005546954 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec  5 06:43:15 np0005546954 kernel: Using GB pages for direct mapping
Dec  5 06:43:15 np0005546954 kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec  5 06:43:15 np0005546954 kernel: ACPI: Early table checksum verification disabled
Dec  5 06:43:15 np0005546954 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec  5 06:43:15 np0005546954 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  5 06:43:15 np0005546954 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  5 06:43:15 np0005546954 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  5 06:43:15 np0005546954 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec  5 06:43:15 np0005546954 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  5 06:43:15 np0005546954 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  5 06:43:15 np0005546954 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec  5 06:43:15 np0005546954 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec  5 06:43:15 np0005546954 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec  5 06:43:15 np0005546954 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec  5 06:43:15 np0005546954 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec  5 06:43:15 np0005546954 kernel: No NUMA configuration found
Dec  5 06:43:15 np0005546954 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec  5 06:43:15 np0005546954 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Dec  5 06:43:15 np0005546954 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec  5 06:43:15 np0005546954 kernel: Zone ranges:
Dec  5 06:43:15 np0005546954 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec  5 06:43:15 np0005546954 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec  5 06:43:15 np0005546954 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec  5 06:43:15 np0005546954 kernel:  Device   empty
Dec  5 06:43:15 np0005546954 kernel: Movable zone start for each node
Dec  5 06:43:15 np0005546954 kernel: Early memory node ranges
Dec  5 06:43:15 np0005546954 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec  5 06:43:15 np0005546954 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec  5 06:43:15 np0005546954 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec  5 06:43:15 np0005546954 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec  5 06:43:15 np0005546954 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec  5 06:43:15 np0005546954 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec  5 06:43:15 np0005546954 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec  5 06:43:15 np0005546954 kernel: ACPI: PM-Timer IO Port: 0x608
Dec  5 06:43:15 np0005546954 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec  5 06:43:15 np0005546954 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec  5 06:43:15 np0005546954 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec  5 06:43:15 np0005546954 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec  5 06:43:15 np0005546954 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec  5 06:43:15 np0005546954 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec  5 06:43:15 np0005546954 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec  5 06:43:15 np0005546954 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec  5 06:43:15 np0005546954 kernel: TSC deadline timer available
Dec  5 06:43:15 np0005546954 kernel: CPU topo: Max. logical packages:   8
Dec  5 06:43:15 np0005546954 kernel: CPU topo: Max. logical dies:       8
Dec  5 06:43:15 np0005546954 kernel: CPU topo: Max. dies per package:   1
Dec  5 06:43:15 np0005546954 kernel: CPU topo: Max. threads per core:   1
Dec  5 06:43:15 np0005546954 kernel: CPU topo: Num. cores per package:     1
Dec  5 06:43:15 np0005546954 kernel: CPU topo: Num. threads per package:   1
Dec  5 06:43:15 np0005546954 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec  5 06:43:15 np0005546954 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec  5 06:43:15 np0005546954 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec  5 06:43:15 np0005546954 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec  5 06:43:15 np0005546954 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec  5 06:43:15 np0005546954 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec  5 06:43:15 np0005546954 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec  5 06:43:15 np0005546954 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec  5 06:43:15 np0005546954 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec  5 06:43:15 np0005546954 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec  5 06:43:15 np0005546954 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec  5 06:43:15 np0005546954 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec  5 06:43:15 np0005546954 kernel: Booting paravirtualized kernel on KVM
Dec  5 06:43:15 np0005546954 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec  5 06:43:15 np0005546954 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec  5 06:43:15 np0005546954 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec  5 06:43:15 np0005546954 kernel: kvm-guest: PV spinlocks disabled, no host support
Dec  5 06:43:15 np0005546954 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  5 06:43:15 np0005546954 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec  5 06:43:15 np0005546954 kernel: random: crng init done
Dec  5 06:43:15 np0005546954 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec  5 06:43:15 np0005546954 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec  5 06:43:15 np0005546954 kernel: Fallback order for Node 0: 0 
Dec  5 06:43:15 np0005546954 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec  5 06:43:15 np0005546954 kernel: Policy zone: Normal
Dec  5 06:43:15 np0005546954 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec  5 06:43:15 np0005546954 kernel: software IO TLB: area num 8.
Dec  5 06:43:15 np0005546954 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec  5 06:43:15 np0005546954 kernel: ftrace: allocating 49335 entries in 193 pages
Dec  5 06:43:15 np0005546954 kernel: ftrace: allocated 193 pages with 3 groups
Dec  5 06:43:15 np0005546954 kernel: Dynamic Preempt: voluntary
Dec  5 06:43:15 np0005546954 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec  5 06:43:15 np0005546954 kernel: rcu: 	RCU event tracing is enabled.
Dec  5 06:43:15 np0005546954 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec  5 06:43:15 np0005546954 kernel: 	Trampoline variant of Tasks RCU enabled.
Dec  5 06:43:15 np0005546954 kernel: 	Rude variant of Tasks RCU enabled.
Dec  5 06:43:15 np0005546954 kernel: 	Tracing variant of Tasks RCU enabled.
Dec  5 06:43:15 np0005546954 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec  5 06:43:15 np0005546954 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec  5 06:43:15 np0005546954 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  5 06:43:15 np0005546954 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  5 06:43:15 np0005546954 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  5 06:43:15 np0005546954 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec  5 06:43:15 np0005546954 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec  5 06:43:15 np0005546954 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec  5 06:43:15 np0005546954 kernel: Console: colour VGA+ 80x25
Dec  5 06:43:15 np0005546954 kernel: printk: console [ttyS0] enabled
Dec  5 06:43:15 np0005546954 kernel: ACPI: Core revision 20230331
Dec  5 06:43:15 np0005546954 kernel: APIC: Switch to symmetric I/O mode setup
Dec  5 06:43:15 np0005546954 kernel: x2apic enabled
Dec  5 06:43:15 np0005546954 kernel: APIC: Switched APIC routing to: physical x2apic
Dec  5 06:43:15 np0005546954 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec  5 06:43:15 np0005546954 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Dec  5 06:43:15 np0005546954 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec  5 06:43:15 np0005546954 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec  5 06:43:15 np0005546954 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec  5 06:43:15 np0005546954 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec  5 06:43:15 np0005546954 kernel: Spectre V2 : Mitigation: Retpolines
Dec  5 06:43:15 np0005546954 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec  5 06:43:15 np0005546954 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec  5 06:43:15 np0005546954 kernel: RETBleed: Mitigation: untrained return thunk
Dec  5 06:43:15 np0005546954 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec  5 06:43:15 np0005546954 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec  5 06:43:15 np0005546954 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec  5 06:43:15 np0005546954 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec  5 06:43:15 np0005546954 kernel: x86/bugs: return thunk changed
Dec  5 06:43:15 np0005546954 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec  5 06:43:15 np0005546954 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec  5 06:43:15 np0005546954 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec  5 06:43:15 np0005546954 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec  5 06:43:15 np0005546954 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec  5 06:43:15 np0005546954 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec  5 06:43:15 np0005546954 kernel: Freeing SMP alternatives memory: 40K
Dec  5 06:43:15 np0005546954 kernel: pid_max: default: 32768 minimum: 301
Dec  5 06:43:15 np0005546954 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec  5 06:43:15 np0005546954 kernel: landlock: Up and running.
Dec  5 06:43:15 np0005546954 kernel: Yama: becoming mindful.
Dec  5 06:43:15 np0005546954 kernel: SELinux:  Initializing.
Dec  5 06:43:15 np0005546954 kernel: LSM support for eBPF active
Dec  5 06:43:15 np0005546954 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  5 06:43:15 np0005546954 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  5 06:43:15 np0005546954 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec  5 06:43:15 np0005546954 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec  5 06:43:15 np0005546954 kernel: ... version:                0
Dec  5 06:43:15 np0005546954 kernel: ... bit width:              48
Dec  5 06:43:15 np0005546954 kernel: ... generic registers:      6
Dec  5 06:43:15 np0005546954 kernel: ... value mask:             0000ffffffffffff
Dec  5 06:43:15 np0005546954 kernel: ... max period:             00007fffffffffff
Dec  5 06:43:15 np0005546954 kernel: ... fixed-purpose events:   0
Dec  5 06:43:15 np0005546954 kernel: ... event mask:             000000000000003f
Dec  5 06:43:15 np0005546954 kernel: signal: max sigframe size: 1776
Dec  5 06:43:15 np0005546954 kernel: rcu: Hierarchical SRCU implementation.
Dec  5 06:43:15 np0005546954 kernel: rcu: 	Max phase no-delay instances is 400.
Dec  5 06:43:15 np0005546954 kernel: smp: Bringing up secondary CPUs ...
Dec  5 06:43:15 np0005546954 kernel: smpboot: x86: Booting SMP configuration:
Dec  5 06:43:15 np0005546954 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec  5 06:43:15 np0005546954 kernel: smp: Brought up 1 node, 8 CPUs
Dec  5 06:43:15 np0005546954 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Dec  5 06:43:15 np0005546954 kernel: node 0 deferred pages initialised in 10ms
Dec  5 06:43:15 np0005546954 kernel: Memory: 7763988K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 618204K reserved, 0K cma-reserved)
Dec  5 06:43:15 np0005546954 kernel: devtmpfs: initialized
Dec  5 06:43:15 np0005546954 kernel: x86/mm: Memory block size: 128MB
Dec  5 06:43:15 np0005546954 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec  5 06:43:15 np0005546954 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec  5 06:43:15 np0005546954 kernel: pinctrl core: initialized pinctrl subsystem
Dec  5 06:43:15 np0005546954 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec  5 06:43:15 np0005546954 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec  5 06:43:15 np0005546954 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec  5 06:43:15 np0005546954 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec  5 06:43:15 np0005546954 kernel: audit: initializing netlink subsys (disabled)
Dec  5 06:43:15 np0005546954 kernel: audit: type=2000 audit(1764934993.160:1): state=initialized audit_enabled=0 res=1
Dec  5 06:43:15 np0005546954 kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec  5 06:43:15 np0005546954 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec  5 06:43:15 np0005546954 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec  5 06:43:15 np0005546954 kernel: cpuidle: using governor menu
Dec  5 06:43:15 np0005546954 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec  5 06:43:15 np0005546954 kernel: PCI: Using configuration type 1 for base access
Dec  5 06:43:15 np0005546954 kernel: PCI: Using configuration type 1 for extended access
Dec  5 06:43:15 np0005546954 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec  5 06:43:15 np0005546954 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec  5 06:43:15 np0005546954 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec  5 06:43:15 np0005546954 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec  5 06:43:15 np0005546954 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec  5 06:43:15 np0005546954 kernel: Demotion targets for Node 0: null
Dec  5 06:43:15 np0005546954 kernel: cryptd: max_cpu_qlen set to 1000
Dec  5 06:43:15 np0005546954 kernel: ACPI: Added _OSI(Module Device)
Dec  5 06:43:15 np0005546954 kernel: ACPI: Added _OSI(Processor Device)
Dec  5 06:43:15 np0005546954 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec  5 06:43:15 np0005546954 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec  5 06:43:15 np0005546954 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec  5 06:43:15 np0005546954 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec  5 06:43:15 np0005546954 kernel: ACPI: Interpreter enabled
Dec  5 06:43:15 np0005546954 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec  5 06:43:15 np0005546954 kernel: ACPI: Using IOAPIC for interrupt routing
Dec  5 06:43:15 np0005546954 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec  5 06:43:15 np0005546954 kernel: PCI: Using E820 reservations for host bridge windows
Dec  5 06:43:15 np0005546954 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec  5 06:43:15 np0005546954 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec  5 06:43:15 np0005546954 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [3] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [4] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [5] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [6] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [7] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [8] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [9] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [10] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [11] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [12] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [13] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [14] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [15] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [16] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [17] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [18] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [19] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [20] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [21] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [22] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [23] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [24] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [25] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [26] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [27] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [28] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [29] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [30] registered
Dec  5 06:43:15 np0005546954 kernel: acpiphp: Slot [31] registered
Dec  5 06:43:15 np0005546954 kernel: PCI host bridge to bus 0000:00
Dec  5 06:43:15 np0005546954 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec  5 06:43:15 np0005546954 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec  5 06:43:15 np0005546954 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec  5 06:43:15 np0005546954 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec  5 06:43:15 np0005546954 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec  5 06:43:15 np0005546954 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec  5 06:43:15 np0005546954 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec  5 06:43:15 np0005546954 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec  5 06:43:15 np0005546954 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec  5 06:43:15 np0005546954 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec  5 06:43:15 np0005546954 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec  5 06:43:15 np0005546954 kernel: iommu: Default domain type: Translated
Dec  5 06:43:15 np0005546954 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec  5 06:43:15 np0005546954 kernel: SCSI subsystem initialized
Dec  5 06:43:15 np0005546954 kernel: ACPI: bus type USB registered
Dec  5 06:43:15 np0005546954 kernel: usbcore: registered new interface driver usbfs
Dec  5 06:43:15 np0005546954 kernel: usbcore: registered new interface driver hub
Dec  5 06:43:15 np0005546954 kernel: usbcore: registered new device driver usb
Dec  5 06:43:15 np0005546954 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec  5 06:43:15 np0005546954 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec  5 06:43:15 np0005546954 kernel: PTP clock support registered
Dec  5 06:43:15 np0005546954 kernel: EDAC MC: Ver: 3.0.0
Dec  5 06:43:15 np0005546954 kernel: NetLabel: Initializing
Dec  5 06:43:15 np0005546954 kernel: NetLabel:  domain hash size = 128
Dec  5 06:43:15 np0005546954 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec  5 06:43:15 np0005546954 kernel: NetLabel:  unlabeled traffic allowed by default
Dec  5 06:43:15 np0005546954 kernel: PCI: Using ACPI for IRQ routing
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec  5 06:43:15 np0005546954 kernel: vgaarb: loaded
Dec  5 06:43:15 np0005546954 kernel: clocksource: Switched to clocksource kvm-clock
Dec  5 06:43:15 np0005546954 kernel: VFS: Disk quotas dquot_6.6.0
Dec  5 06:43:15 np0005546954 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec  5 06:43:15 np0005546954 kernel: pnp: PnP ACPI init
Dec  5 06:43:15 np0005546954 kernel: pnp: PnP ACPI: found 5 devices
Dec  5 06:43:15 np0005546954 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec  5 06:43:15 np0005546954 kernel: NET: Registered PF_INET protocol family
Dec  5 06:43:15 np0005546954 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec  5 06:43:15 np0005546954 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec  5 06:43:15 np0005546954 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec  5 06:43:15 np0005546954 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec  5 06:43:15 np0005546954 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec  5 06:43:15 np0005546954 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec  5 06:43:15 np0005546954 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec  5 06:43:15 np0005546954 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  5 06:43:15 np0005546954 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  5 06:43:15 np0005546954 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec  5 06:43:15 np0005546954 kernel: NET: Registered PF_XDP protocol family
Dec  5 06:43:15 np0005546954 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec  5 06:43:15 np0005546954 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec  5 06:43:15 np0005546954 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec  5 06:43:15 np0005546954 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec  5 06:43:15 np0005546954 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec  5 06:43:15 np0005546954 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec  5 06:43:15 np0005546954 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 77864 usecs
Dec  5 06:43:15 np0005546954 kernel: PCI: CLS 0 bytes, default 64
Dec  5 06:43:15 np0005546954 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec  5 06:43:15 np0005546954 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec  5 06:43:15 np0005546954 kernel: Trying to unpack rootfs image as initramfs...
Dec  5 06:43:15 np0005546954 kernel: ACPI: bus type thunderbolt registered
Dec  5 06:43:15 np0005546954 kernel: Initialise system trusted keyrings
Dec  5 06:43:15 np0005546954 kernel: Key type blacklist registered
Dec  5 06:43:15 np0005546954 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec  5 06:43:15 np0005546954 kernel: zbud: loaded
Dec  5 06:43:15 np0005546954 kernel: integrity: Platform Keyring initialized
Dec  5 06:43:15 np0005546954 kernel: integrity: Machine keyring initialized
Dec  5 06:43:15 np0005546954 kernel: Freeing initrd memory: 87804K
Dec  5 06:43:15 np0005546954 kernel: NET: Registered PF_ALG protocol family
Dec  5 06:43:15 np0005546954 kernel: xor: automatically using best checksumming function   avx       
Dec  5 06:43:15 np0005546954 kernel: Key type asymmetric registered
Dec  5 06:43:15 np0005546954 kernel: Asymmetric key parser 'x509' registered
Dec  5 06:43:15 np0005546954 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec  5 06:43:15 np0005546954 kernel: io scheduler mq-deadline registered
Dec  5 06:43:15 np0005546954 kernel: io scheduler kyber registered
Dec  5 06:43:15 np0005546954 kernel: io scheduler bfq registered
Dec  5 06:43:15 np0005546954 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec  5 06:43:15 np0005546954 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec  5 06:43:15 np0005546954 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec  5 06:43:15 np0005546954 kernel: ACPI: button: Power Button [PWRF]
Dec  5 06:43:15 np0005546954 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec  5 06:43:15 np0005546954 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec  5 06:43:15 np0005546954 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec  5 06:43:15 np0005546954 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec  5 06:43:15 np0005546954 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec  5 06:43:15 np0005546954 kernel: Non-volatile memory driver v1.3
Dec  5 06:43:15 np0005546954 kernel: rdac: device handler registered
Dec  5 06:43:15 np0005546954 kernel: hp_sw: device handler registered
Dec  5 06:43:15 np0005546954 kernel: emc: device handler registered
Dec  5 06:43:15 np0005546954 kernel: alua: device handler registered
Dec  5 06:43:15 np0005546954 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec  5 06:43:15 np0005546954 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec  5 06:43:15 np0005546954 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec  5 06:43:15 np0005546954 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec  5 06:43:15 np0005546954 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec  5 06:43:15 np0005546954 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec  5 06:43:15 np0005546954 kernel: usb usb1: Product: UHCI Host Controller
Dec  5 06:43:15 np0005546954 kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec  5 06:43:15 np0005546954 kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec  5 06:43:15 np0005546954 kernel: hub 1-0:1.0: USB hub found
Dec  5 06:43:15 np0005546954 kernel: hub 1-0:1.0: 2 ports detected
Dec  5 06:43:15 np0005546954 kernel: usbcore: registered new interface driver usbserial_generic
Dec  5 06:43:15 np0005546954 kernel: usbserial: USB Serial support registered for generic
Dec  5 06:43:15 np0005546954 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec  5 06:43:15 np0005546954 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec  5 06:43:15 np0005546954 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec  5 06:43:15 np0005546954 kernel: mousedev: PS/2 mouse device common for all mice
Dec  5 06:43:15 np0005546954 kernel: rtc_cmos 00:04: RTC can wake from S4
Dec  5 06:43:15 np0005546954 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec  5 06:43:15 np0005546954 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec  5 06:43:15 np0005546954 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec  5 06:43:15 np0005546954 kernel: rtc_cmos 00:04: registered as rtc0
Dec  5 06:43:15 np0005546954 kernel: rtc_cmos 00:04: setting system clock to 2025-12-05T11:43:14 UTC (1764934994)
Dec  5 06:43:15 np0005546954 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec  5 06:43:15 np0005546954 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec  5 06:43:15 np0005546954 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec  5 06:43:15 np0005546954 kernel: usbcore: registered new interface driver usbhid
Dec  5 06:43:15 np0005546954 kernel: usbhid: USB HID core driver
Dec  5 06:43:15 np0005546954 kernel: drop_monitor: Initializing network drop monitor service
Dec  5 06:43:15 np0005546954 kernel: Initializing XFRM netlink socket
Dec  5 06:43:15 np0005546954 kernel: NET: Registered PF_INET6 protocol family
Dec  5 06:43:15 np0005546954 kernel: Segment Routing with IPv6
Dec  5 06:43:15 np0005546954 kernel: NET: Registered PF_PACKET protocol family
Dec  5 06:43:15 np0005546954 kernel: mpls_gso: MPLS GSO support
Dec  5 06:43:15 np0005546954 kernel: IPI shorthand broadcast: enabled
Dec  5 06:43:15 np0005546954 kernel: AVX2 version of gcm_enc/dec engaged.
Dec  5 06:43:15 np0005546954 kernel: AES CTR mode by8 optimization enabled
Dec  5 06:43:15 np0005546954 kernel: sched_clock: Marking stable (1226006799, 145115199)->(1494185586, -123063588)
Dec  5 06:43:15 np0005546954 kernel: registered taskstats version 1
Dec  5 06:43:15 np0005546954 kernel: Loading compiled-in X.509 certificates
Dec  5 06:43:15 np0005546954 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec  5 06:43:15 np0005546954 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec  5 06:43:15 np0005546954 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec  5 06:43:15 np0005546954 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec  5 06:43:15 np0005546954 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec  5 06:43:15 np0005546954 kernel: Demotion targets for Node 0: null
Dec  5 06:43:15 np0005546954 kernel: page_owner is disabled
Dec  5 06:43:15 np0005546954 kernel: Key type .fscrypt registered
Dec  5 06:43:15 np0005546954 kernel: Key type fscrypt-provisioning registered
Dec  5 06:43:15 np0005546954 kernel: Key type big_key registered
Dec  5 06:43:15 np0005546954 kernel: Key type encrypted registered
Dec  5 06:43:15 np0005546954 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec  5 06:43:15 np0005546954 kernel: Loading compiled-in module X.509 certificates
Dec  5 06:43:15 np0005546954 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec  5 06:43:15 np0005546954 kernel: ima: Allocated hash algorithm: sha256
Dec  5 06:43:15 np0005546954 kernel: ima: No architecture policies found
Dec  5 06:43:15 np0005546954 kernel: evm: Initialising EVM extended attributes:
Dec  5 06:43:15 np0005546954 kernel: evm: security.selinux
Dec  5 06:43:15 np0005546954 kernel: evm: security.SMACK64 (disabled)
Dec  5 06:43:15 np0005546954 kernel: evm: security.SMACK64EXEC (disabled)
Dec  5 06:43:15 np0005546954 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec  5 06:43:15 np0005546954 kernel: evm: security.SMACK64MMAP (disabled)
Dec  5 06:43:15 np0005546954 kernel: evm: security.apparmor (disabled)
Dec  5 06:43:15 np0005546954 kernel: evm: security.ima
Dec  5 06:43:15 np0005546954 kernel: evm: security.capability
Dec  5 06:43:15 np0005546954 kernel: evm: HMAC attrs: 0x1
Dec  5 06:43:15 np0005546954 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec  5 06:43:15 np0005546954 kernel: Running certificate verification RSA selftest
Dec  5 06:43:15 np0005546954 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec  5 06:43:15 np0005546954 kernel: Running certificate verification ECDSA selftest
Dec  5 06:43:15 np0005546954 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec  5 06:43:15 np0005546954 kernel: clk: Disabling unused clocks
Dec  5 06:43:15 np0005546954 kernel: Freeing unused decrypted memory: 2028K
Dec  5 06:43:15 np0005546954 kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec  5 06:43:15 np0005546954 kernel: Write protecting the kernel read-only data: 30720k
Dec  5 06:43:15 np0005546954 kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec  5 06:43:15 np0005546954 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec  5 06:43:15 np0005546954 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec  5 06:43:15 np0005546954 kernel: usb 1-1: Product: QEMU USB Tablet
Dec  5 06:43:15 np0005546954 kernel: usb 1-1: Manufacturer: QEMU
Dec  5 06:43:15 np0005546954 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec  5 06:43:15 np0005546954 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec  5 06:43:15 np0005546954 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec  5 06:43:15 np0005546954 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec  5 06:43:15 np0005546954 kernel: Run /init as init process
Dec  5 06:43:15 np0005546954 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  5 06:43:15 np0005546954 systemd: Detected virtualization kvm.
Dec  5 06:43:15 np0005546954 systemd: Detected architecture x86-64.
Dec  5 06:43:15 np0005546954 systemd: Running in initrd.
Dec  5 06:43:15 np0005546954 systemd: No hostname configured, using default hostname.
Dec  5 06:43:15 np0005546954 systemd: Hostname set to <localhost>.
Dec  5 06:43:15 np0005546954 systemd: Initializing machine ID from VM UUID.
Dec  5 06:43:15 np0005546954 systemd: Queued start job for default target Initrd Default Target.
Dec  5 06:43:15 np0005546954 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  5 06:43:15 np0005546954 systemd: Reached target Local Encrypted Volumes.
Dec  5 06:43:15 np0005546954 systemd: Reached target Initrd /usr File System.
Dec  5 06:43:15 np0005546954 systemd: Reached target Local File Systems.
Dec  5 06:43:15 np0005546954 systemd: Reached target Path Units.
Dec  5 06:43:15 np0005546954 systemd: Reached target Slice Units.
Dec  5 06:43:15 np0005546954 systemd: Reached target Swaps.
Dec  5 06:43:15 np0005546954 systemd: Reached target Timer Units.
Dec  5 06:43:15 np0005546954 systemd: Listening on D-Bus System Message Bus Socket.
Dec  5 06:43:15 np0005546954 systemd: Listening on Journal Socket (/dev/log).
Dec  5 06:43:15 np0005546954 systemd: Listening on Journal Socket.
Dec  5 06:43:15 np0005546954 systemd: Listening on udev Control Socket.
Dec  5 06:43:15 np0005546954 systemd: Listening on udev Kernel Socket.
Dec  5 06:43:15 np0005546954 systemd: Reached target Socket Units.
Dec  5 06:43:15 np0005546954 systemd: Starting Create List of Static Device Nodes...
Dec  5 06:43:15 np0005546954 systemd: Starting Journal Service...
Dec  5 06:43:15 np0005546954 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  5 06:43:15 np0005546954 systemd: Starting Apply Kernel Variables...
Dec  5 06:43:15 np0005546954 systemd: Starting Create System Users...
Dec  5 06:43:15 np0005546954 systemd: Starting Setup Virtual Console...
Dec  5 06:43:15 np0005546954 systemd: Finished Create List of Static Device Nodes.
Dec  5 06:43:15 np0005546954 systemd: Finished Apply Kernel Variables.
Dec  5 06:43:15 np0005546954 systemd: Finished Create System Users.
Dec  5 06:43:15 np0005546954 systemd: Starting Create Static Device Nodes in /dev...
Dec  5 06:43:15 np0005546954 systemd-journald[305]: Journal started
Dec  5 06:43:15 np0005546954 systemd-journald[305]: Runtime Journal (/run/log/journal/a16ad662d4264c8c9ec3e00cbbaff345) is 8.0M, max 153.6M, 145.6M free.
Dec  5 06:43:15 np0005546954 systemd-sysusers[309]: Creating group 'users' with GID 100.
Dec  5 06:43:15 np0005546954 systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Dec  5 06:43:15 np0005546954 systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec  5 06:43:15 np0005546954 systemd: Started Journal Service.
Dec  5 06:43:15 np0005546954 systemd[1]: Starting Create Volatile Files and Directories...
Dec  5 06:43:15 np0005546954 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  5 06:43:15 np0005546954 systemd[1]: Finished Create Volatile Files and Directories.
Dec  5 06:43:15 np0005546954 systemd[1]: Finished Setup Virtual Console.
Dec  5 06:43:15 np0005546954 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec  5 06:43:15 np0005546954 systemd[1]: Starting dracut cmdline hook...
Dec  5 06:43:15 np0005546954 dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Dec  5 06:43:15 np0005546954 dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  5 06:43:15 np0005546954 systemd[1]: Finished dracut cmdline hook.
Dec  5 06:43:15 np0005546954 systemd[1]: Starting dracut pre-udev hook...
Dec  5 06:43:15 np0005546954 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec  5 06:43:15 np0005546954 kernel: device-mapper: uevent: version 1.0.3
Dec  5 06:43:15 np0005546954 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec  5 06:43:15 np0005546954 kernel: RPC: Registered named UNIX socket transport module.
Dec  5 06:43:15 np0005546954 kernel: RPC: Registered udp transport module.
Dec  5 06:43:15 np0005546954 kernel: RPC: Registered tcp transport module.
Dec  5 06:43:15 np0005546954 kernel: RPC: Registered tcp-with-tls transport module.
Dec  5 06:43:15 np0005546954 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec  5 06:43:15 np0005546954 rpc.statd[442]: Version 2.5.4 starting
Dec  5 06:43:15 np0005546954 rpc.statd[442]: Initializing NSM state
Dec  5 06:43:15 np0005546954 rpc.idmapd[447]: Setting log level to 0
Dec  5 06:43:15 np0005546954 systemd[1]: Finished dracut pre-udev hook.
Dec  5 06:43:15 np0005546954 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  5 06:43:15 np0005546954 systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Dec  5 06:43:15 np0005546954 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  5 06:43:15 np0005546954 systemd[1]: Starting dracut pre-trigger hook...
Dec  5 06:43:15 np0005546954 systemd[1]: Finished dracut pre-trigger hook.
Dec  5 06:43:15 np0005546954 systemd[1]: Starting Coldplug All udev Devices...
Dec  5 06:43:15 np0005546954 systemd[1]: Created slice Slice /system/modprobe.
Dec  5 06:43:15 np0005546954 systemd[1]: Starting Load Kernel Module configfs...
Dec  5 06:43:15 np0005546954 systemd[1]: Finished Coldplug All udev Devices.
Dec  5 06:43:15 np0005546954 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  5 06:43:15 np0005546954 systemd[1]: Reached target Network.
Dec  5 06:43:15 np0005546954 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  5 06:43:15 np0005546954 systemd[1]: Starting dracut initqueue hook...
Dec  5 06:43:15 np0005546954 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  5 06:43:15 np0005546954 systemd[1]: Finished Load Kernel Module configfs.
Dec  5 06:43:16 np0005546954 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec  5 06:43:16 np0005546954 systemd-udevd[481]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:43:16 np0005546954 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec  5 06:43:16 np0005546954 kernel: vda: vda1
Dec  5 06:43:16 np0005546954 kernel: scsi host0: ata_piix
Dec  5 06:43:16 np0005546954 kernel: scsi host1: ata_piix
Dec  5 06:43:16 np0005546954 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec  5 06:43:16 np0005546954 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec  5 06:43:16 np0005546954 systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec  5 06:43:16 np0005546954 systemd[1]: Reached target Initrd Root Device.
Dec  5 06:43:16 np0005546954 systemd[1]: Mounting Kernel Configuration File System...
Dec  5 06:43:16 np0005546954 systemd[1]: Mounted Kernel Configuration File System.
Dec  5 06:43:16 np0005546954 systemd[1]: Reached target System Initialization.
Dec  5 06:43:16 np0005546954 systemd[1]: Reached target Basic System.
Dec  5 06:43:16 np0005546954 kernel: ata1: found unknown device (class 0)
Dec  5 06:43:16 np0005546954 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec  5 06:43:16 np0005546954 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec  5 06:43:16 np0005546954 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec  5 06:43:16 np0005546954 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec  5 06:43:16 np0005546954 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec  5 06:43:16 np0005546954 systemd[1]: Finished dracut initqueue hook.
Dec  5 06:43:16 np0005546954 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  5 06:43:16 np0005546954 systemd[1]: Reached target Remote Encrypted Volumes.
Dec  5 06:43:16 np0005546954 systemd[1]: Reached target Remote File Systems.
Dec  5 06:43:16 np0005546954 systemd[1]: Starting dracut pre-mount hook...
Dec  5 06:43:16 np0005546954 systemd[1]: Finished dracut pre-mount hook.
Dec  5 06:43:16 np0005546954 systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec  5 06:43:16 np0005546954 systemd-fsck[552]: /usr/sbin/fsck.xfs: XFS file system.
Dec  5 06:43:16 np0005546954 systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec  5 06:43:16 np0005546954 systemd[1]: Mounting /sysroot...
Dec  5 06:43:16 np0005546954 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec  5 06:43:16 np0005546954 kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec  5 06:43:17 np0005546954 kernel: XFS (vda1): Ending clean mount
Dec  5 06:43:17 np0005546954 systemd[1]: Mounted /sysroot.
Dec  5 06:43:17 np0005546954 systemd[1]: Reached target Initrd Root File System.
Dec  5 06:43:17 np0005546954 systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec  5 06:43:17 np0005546954 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec  5 06:43:17 np0005546954 systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec  5 06:43:17 np0005546954 systemd[1]: Reached target Initrd File Systems.
Dec  5 06:43:17 np0005546954 systemd[1]: Reached target Initrd Default Target.
Dec  5 06:43:17 np0005546954 systemd[1]: Starting dracut mount hook...
Dec  5 06:43:17 np0005546954 systemd[1]: Finished dracut mount hook.
Dec  5 06:43:17 np0005546954 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec  5 06:43:17 np0005546954 rpc.idmapd[447]: exiting on signal 15
Dec  5 06:43:17 np0005546954 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec  5 06:43:17 np0005546954 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec  5 06:43:17 np0005546954 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Network.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Remote Encrypted Volumes.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Timer Units.
Dec  5 06:43:18 np0005546954 systemd[1]: dbus.socket: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Closed D-Bus System Message Bus Socket.
Dec  5 06:43:18 np0005546954 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Initrd Default Target.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Basic System.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Initrd Root Device.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Initrd /usr File System.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Path Units.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Remote File Systems.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Preparation for Remote File Systems.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Slice Units.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Socket Units.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target System Initialization.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Local File Systems.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Swaps.
Dec  5 06:43:18 np0005546954 systemd[1]: dracut-mount.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped dracut mount hook.
Dec  5 06:43:18 np0005546954 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped dracut pre-mount hook.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped target Local Encrypted Volumes.
Dec  5 06:43:18 np0005546954 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec  5 06:43:18 np0005546954 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped dracut initqueue hook.
Dec  5 06:43:18 np0005546954 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped Apply Kernel Variables.
Dec  5 06:43:18 np0005546954 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped Create Volatile Files and Directories.
Dec  5 06:43:18 np0005546954 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped Coldplug All udev Devices.
Dec  5 06:43:18 np0005546954 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped dracut pre-trigger hook.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec  5 06:43:18 np0005546954 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped Setup Virtual Console.
Dec  5 06:43:18 np0005546954 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec  5 06:43:18 np0005546954 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec  5 06:43:18 np0005546954 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Closed udev Control Socket.
Dec  5 06:43:18 np0005546954 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Closed udev Kernel Socket.
Dec  5 06:43:18 np0005546954 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped dracut pre-udev hook.
Dec  5 06:43:18 np0005546954 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped dracut cmdline hook.
Dec  5 06:43:18 np0005546954 systemd[1]: Starting Cleanup udev Database...
Dec  5 06:43:18 np0005546954 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec  5 06:43:18 np0005546954 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped Create List of Static Device Nodes.
Dec  5 06:43:18 np0005546954 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Stopped Create System Users.
Dec  5 06:43:18 np0005546954 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Finished Cleanup udev Database.
Dec  5 06:43:18 np0005546954 systemd[1]: Reached target Switch Root.
Dec  5 06:43:18 np0005546954 systemd[1]: Starting Switch Root...
Dec  5 06:43:18 np0005546954 systemd[1]: Switching root.
Dec  5 06:43:18 np0005546954 systemd-journald[305]: Journal stopped
Dec  5 06:43:18 np0005546954 systemd-journald: Received SIGTERM from PID 1 (systemd).
Dec  5 06:43:18 np0005546954 kernel: audit: type=1404 audit(1764934998.339:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec  5 06:43:18 np0005546954 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:43:18 np0005546954 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:43:18 np0005546954 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:43:18 np0005546954 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:43:18 np0005546954 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:43:18 np0005546954 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:43:18 np0005546954 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:43:18 np0005546954 kernel: audit: type=1403 audit(1764934998.471:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec  5 06:43:18 np0005546954 systemd: Successfully loaded SELinux policy in 137.896ms.
Dec  5 06:43:18 np0005546954 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.459ms.
Dec  5 06:43:18 np0005546954 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  5 06:43:18 np0005546954 systemd: Detected virtualization kvm.
Dec  5 06:43:18 np0005546954 systemd: Detected architecture x86-64.
Dec  5 06:43:18 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:43:18 np0005546954 systemd: initrd-switch-root.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd: Stopped Switch Root.
Dec  5 06:43:18 np0005546954 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec  5 06:43:18 np0005546954 systemd: Created slice Slice /system/getty.
Dec  5 06:43:18 np0005546954 systemd: Created slice Slice /system/serial-getty.
Dec  5 06:43:18 np0005546954 systemd: Created slice Slice /system/sshd-keygen.
Dec  5 06:43:18 np0005546954 systemd: Created slice User and Session Slice.
Dec  5 06:43:18 np0005546954 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  5 06:43:18 np0005546954 systemd: Started Forward Password Requests to Wall Directory Watch.
Dec  5 06:43:18 np0005546954 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec  5 06:43:18 np0005546954 systemd: Reached target Local Encrypted Volumes.
Dec  5 06:43:18 np0005546954 systemd: Stopped target Switch Root.
Dec  5 06:43:18 np0005546954 systemd: Stopped target Initrd File Systems.
Dec  5 06:43:18 np0005546954 systemd: Stopped target Initrd Root File System.
Dec  5 06:43:18 np0005546954 systemd: Reached target Local Integrity Protected Volumes.
Dec  5 06:43:18 np0005546954 systemd: Reached target Path Units.
Dec  5 06:43:18 np0005546954 systemd: Reached target rpc_pipefs.target.
Dec  5 06:43:18 np0005546954 systemd: Reached target Slice Units.
Dec  5 06:43:18 np0005546954 systemd: Reached target Swaps.
Dec  5 06:43:18 np0005546954 systemd: Reached target Local Verity Protected Volumes.
Dec  5 06:43:18 np0005546954 systemd: Listening on RPCbind Server Activation Socket.
Dec  5 06:43:18 np0005546954 systemd: Reached target RPC Port Mapper.
Dec  5 06:43:18 np0005546954 systemd: Listening on Process Core Dump Socket.
Dec  5 06:43:18 np0005546954 systemd: Listening on initctl Compatibility Named Pipe.
Dec  5 06:43:18 np0005546954 systemd: Listening on udev Control Socket.
Dec  5 06:43:18 np0005546954 systemd: Listening on udev Kernel Socket.
Dec  5 06:43:18 np0005546954 systemd: Mounting Huge Pages File System...
Dec  5 06:43:18 np0005546954 systemd: Mounting POSIX Message Queue File System...
Dec  5 06:43:18 np0005546954 systemd: Mounting Kernel Debug File System...
Dec  5 06:43:18 np0005546954 systemd: Mounting Kernel Trace File System...
Dec  5 06:43:18 np0005546954 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  5 06:43:18 np0005546954 systemd: Starting Create List of Static Device Nodes...
Dec  5 06:43:18 np0005546954 systemd: Starting Load Kernel Module configfs...
Dec  5 06:43:18 np0005546954 systemd: Starting Load Kernel Module drm...
Dec  5 06:43:18 np0005546954 systemd: Starting Load Kernel Module efi_pstore...
Dec  5 06:43:18 np0005546954 systemd: Starting Load Kernel Module fuse...
Dec  5 06:43:18 np0005546954 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec  5 06:43:18 np0005546954 systemd: systemd-fsck-root.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd: Stopped File System Check on Root Device.
Dec  5 06:43:18 np0005546954 systemd: Stopped Journal Service.
Dec  5 06:43:18 np0005546954 systemd: Starting Journal Service...
Dec  5 06:43:18 np0005546954 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  5 06:43:18 np0005546954 systemd: Starting Generate network units from Kernel command line...
Dec  5 06:43:18 np0005546954 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  5 06:43:18 np0005546954 systemd: Starting Remount Root and Kernel File Systems...
Dec  5 06:43:18 np0005546954 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec  5 06:43:18 np0005546954 systemd: Starting Apply Kernel Variables...
Dec  5 06:43:18 np0005546954 kernel: fuse: init (API version 7.37)
Dec  5 06:43:18 np0005546954 systemd: Starting Coldplug All udev Devices...
Dec  5 06:43:18 np0005546954 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec  5 06:43:18 np0005546954 systemd: Mounted Huge Pages File System.
Dec  5 06:43:18 np0005546954 systemd-journald[674]: Journal started
Dec  5 06:43:18 np0005546954 systemd-journald[674]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec  5 06:43:18 np0005546954 systemd[1]: Queued start job for default target Multi-User System.
Dec  5 06:43:18 np0005546954 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd: Started Journal Service.
Dec  5 06:43:18 np0005546954 systemd[1]: Mounted POSIX Message Queue File System.
Dec  5 06:43:18 np0005546954 systemd[1]: Mounted Kernel Debug File System.
Dec  5 06:43:18 np0005546954 systemd[1]: Mounted Kernel Trace File System.
Dec  5 06:43:18 np0005546954 systemd[1]: Finished Create List of Static Device Nodes.
Dec  5 06:43:18 np0005546954 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Finished Load Kernel Module configfs.
Dec  5 06:43:18 np0005546954 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Finished Load Kernel Module efi_pstore.
Dec  5 06:43:18 np0005546954 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Finished Load Kernel Module fuse.
Dec  5 06:43:18 np0005546954 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec  5 06:43:18 np0005546954 systemd[1]: Finished Generate network units from Kernel command line.
Dec  5 06:43:18 np0005546954 systemd[1]: Finished Remount Root and Kernel File Systems.
Dec  5 06:43:18 np0005546954 systemd[1]: Finished Apply Kernel Variables.
Dec  5 06:43:18 np0005546954 kernel: ACPI: bus type drm_connector registered
Dec  5 06:43:18 np0005546954 systemd[1]: Mounting FUSE Control File System...
Dec  5 06:43:18 np0005546954 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  5 06:43:18 np0005546954 systemd[1]: Starting Rebuild Hardware Database...
Dec  5 06:43:18 np0005546954 systemd[1]: Starting Flush Journal to Persistent Storage...
Dec  5 06:43:18 np0005546954 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec  5 06:43:18 np0005546954 systemd[1]: Starting Load/Save OS Random Seed...
Dec  5 06:43:18 np0005546954 systemd[1]: Starting Create System Users...
Dec  5 06:43:18 np0005546954 systemd-journald[674]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec  5 06:43:18 np0005546954 systemd-journald[674]: Received client request to flush runtime journal.
Dec  5 06:43:18 np0005546954 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec  5 06:43:18 np0005546954 systemd[1]: Finished Load Kernel Module drm.
Dec  5 06:43:18 np0005546954 systemd[1]: Mounted FUSE Control File System.
Dec  5 06:43:18 np0005546954 systemd[1]: Finished Flush Journal to Persistent Storage.
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Coldplug All udev Devices.
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Load/Save OS Random Seed.
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Create System Users.
Dec  5 06:43:19 np0005546954 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  5 06:43:19 np0005546954 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  5 06:43:19 np0005546954 systemd[1]: Reached target Preparation for Local File Systems.
Dec  5 06:43:19 np0005546954 systemd[1]: Reached target Local File Systems.
Dec  5 06:43:19 np0005546954 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec  5 06:43:19 np0005546954 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec  5 06:43:19 np0005546954 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec  5 06:43:19 np0005546954 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec  5 06:43:19 np0005546954 systemd[1]: Starting Automatic Boot Loader Update...
Dec  5 06:43:19 np0005546954 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec  5 06:43:19 np0005546954 systemd[1]: Starting Create Volatile Files and Directories...
Dec  5 06:43:19 np0005546954 bootctl[693]: Couldn't find EFI system partition, skipping.
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Automatic Boot Loader Update.
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Create Volatile Files and Directories.
Dec  5 06:43:19 np0005546954 systemd[1]: Starting Security Auditing Service...
Dec  5 06:43:19 np0005546954 systemd[1]: Starting RPC Bind...
Dec  5 06:43:19 np0005546954 systemd[1]: Starting Rebuild Journal Catalog...
Dec  5 06:43:19 np0005546954 auditd[699]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec  5 06:43:19 np0005546954 auditd[699]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec  5 06:43:19 np0005546954 systemd[1]: Started RPC Bind.
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Rebuild Journal Catalog.
Dec  5 06:43:19 np0005546954 augenrules[704]: /sbin/augenrules: No change
Dec  5 06:43:19 np0005546954 augenrules[719]: No rules
Dec  5 06:43:19 np0005546954 augenrules[719]: enabled 1
Dec  5 06:43:19 np0005546954 augenrules[719]: failure 1
Dec  5 06:43:19 np0005546954 augenrules[719]: pid 699
Dec  5 06:43:19 np0005546954 augenrules[719]: rate_limit 0
Dec  5 06:43:19 np0005546954 augenrules[719]: backlog_limit 8192
Dec  5 06:43:19 np0005546954 augenrules[719]: lost 0
Dec  5 06:43:19 np0005546954 augenrules[719]: backlog 0
Dec  5 06:43:19 np0005546954 augenrules[719]: backlog_wait_time 60000
Dec  5 06:43:19 np0005546954 augenrules[719]: backlog_wait_time_actual 0
Dec  5 06:43:19 np0005546954 augenrules[719]: enabled 1
Dec  5 06:43:19 np0005546954 augenrules[719]: failure 1
Dec  5 06:43:19 np0005546954 augenrules[719]: pid 699
Dec  5 06:43:19 np0005546954 augenrules[719]: rate_limit 0
Dec  5 06:43:19 np0005546954 augenrules[719]: backlog_limit 8192
Dec  5 06:43:19 np0005546954 augenrules[719]: lost 0
Dec  5 06:43:19 np0005546954 augenrules[719]: backlog 0
Dec  5 06:43:19 np0005546954 augenrules[719]: backlog_wait_time 60000
Dec  5 06:43:19 np0005546954 augenrules[719]: backlog_wait_time_actual 0
Dec  5 06:43:19 np0005546954 augenrules[719]: enabled 1
Dec  5 06:43:19 np0005546954 augenrules[719]: failure 1
Dec  5 06:43:19 np0005546954 augenrules[719]: pid 699
Dec  5 06:43:19 np0005546954 augenrules[719]: rate_limit 0
Dec  5 06:43:19 np0005546954 augenrules[719]: backlog_limit 8192
Dec  5 06:43:19 np0005546954 augenrules[719]: lost 0
Dec  5 06:43:19 np0005546954 augenrules[719]: backlog 0
Dec  5 06:43:19 np0005546954 augenrules[719]: backlog_wait_time 60000
Dec  5 06:43:19 np0005546954 augenrules[719]: backlog_wait_time_actual 0
Dec  5 06:43:19 np0005546954 systemd[1]: Started Security Auditing Service.
Dec  5 06:43:19 np0005546954 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Rebuild Hardware Database.
Dec  5 06:43:19 np0005546954 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  5 06:43:19 np0005546954 systemd[1]: Starting Update is Completed...
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Update is Completed.
Dec  5 06:43:19 np0005546954 systemd-udevd[727]: Using default interface naming scheme 'rhel-9.0'.
Dec  5 06:43:19 np0005546954 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  5 06:43:19 np0005546954 systemd[1]: Reached target System Initialization.
Dec  5 06:43:19 np0005546954 systemd[1]: Started dnf makecache --timer.
Dec  5 06:43:19 np0005546954 systemd[1]: Started Daily rotation of log files.
Dec  5 06:43:19 np0005546954 systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec  5 06:43:19 np0005546954 systemd[1]: Reached target Timer Units.
Dec  5 06:43:19 np0005546954 systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec  5 06:43:19 np0005546954 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec  5 06:43:19 np0005546954 systemd[1]: Reached target Socket Units.
Dec  5 06:43:19 np0005546954 systemd[1]: Starting D-Bus System Message Bus...
Dec  5 06:43:19 np0005546954 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  5 06:43:19 np0005546954 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec  5 06:43:19 np0005546954 systemd[1]: Starting Load Kernel Module configfs...
Dec  5 06:43:19 np0005546954 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Load Kernel Module configfs.
Dec  5 06:43:19 np0005546954 systemd-udevd[732]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:43:19 np0005546954 systemd[1]: Started D-Bus System Message Bus.
Dec  5 06:43:19 np0005546954 systemd[1]: Reached target Basic System.
Dec  5 06:43:19 np0005546954 dbus-broker-lau[738]: Ready
Dec  5 06:43:19 np0005546954 systemd[1]: Starting NTP client/server...
Dec  5 06:43:19 np0005546954 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec  5 06:43:19 np0005546954 systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec  5 06:43:19 np0005546954 systemd[1]: Starting IPv4 firewall with iptables...
Dec  5 06:43:19 np0005546954 systemd[1]: Started irqbalance daemon.
Dec  5 06:43:19 np0005546954 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec  5 06:43:19 np0005546954 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  5 06:43:19 np0005546954 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  5 06:43:19 np0005546954 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  5 06:43:19 np0005546954 systemd[1]: Reached target sshd-keygen.target.
Dec  5 06:43:19 np0005546954 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec  5 06:43:19 np0005546954 systemd[1]: Reached target User and Group Name Lookups.
Dec  5 06:43:19 np0005546954 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec  5 06:43:19 np0005546954 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec  5 06:43:19 np0005546954 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec  5 06:43:19 np0005546954 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec  5 06:43:19 np0005546954 systemd[1]: Starting User Login Management...
Dec  5 06:43:19 np0005546954 systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec  5 06:43:19 np0005546954 chronyd[791]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  5 06:43:19 np0005546954 chronyd[791]: Loaded 0 symmetric keys
Dec  5 06:43:19 np0005546954 chronyd[791]: Using right/UTC timezone to obtain leap second data
Dec  5 06:43:19 np0005546954 chronyd[791]: Loaded seccomp filter (level 2)
Dec  5 06:43:19 np0005546954 systemd[1]: Started NTP client/server.
Dec  5 06:43:19 np0005546954 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec  5 06:43:20 np0005546954 systemd-logind[789]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  5 06:43:20 np0005546954 systemd-logind[789]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  5 06:43:20 np0005546954 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec  5 06:43:20 np0005546954 systemd-logind[789]: New seat seat0.
Dec  5 06:43:20 np0005546954 systemd[1]: Started User Login Management.
Dec  5 06:43:20 np0005546954 kernel: kvm_amd: TSC scaling supported
Dec  5 06:43:20 np0005546954 kernel: kvm_amd: Nested Virtualization enabled
Dec  5 06:43:20 np0005546954 kernel: kvm_amd: Nested Paging enabled
Dec  5 06:43:20 np0005546954 kernel: kvm_amd: LBR virtualization supported
Dec  5 06:43:20 np0005546954 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec  5 06:43:20 np0005546954 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec  5 06:43:20 np0005546954 kernel: Console: switching to colour dummy device 80x25
Dec  5 06:43:20 np0005546954 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec  5 06:43:20 np0005546954 kernel: [drm] features: -context_init
Dec  5 06:43:20 np0005546954 kernel: [drm] number of scanouts: 1
Dec  5 06:43:20 np0005546954 kernel: [drm] number of cap sets: 0
Dec  5 06:43:20 np0005546954 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec  5 06:43:20 np0005546954 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec  5 06:43:20 np0005546954 kernel: Console: switching to colour frame buffer device 128x48
Dec  5 06:43:20 np0005546954 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec  5 06:43:20 np0005546954 iptables.init[774]: iptables: Applying firewall rules: [  OK  ]
Dec  5 06:43:20 np0005546954 systemd[1]: Finished IPv4 firewall with iptables.
Dec  5 06:43:20 np0005546954 cloud-init[835]: Cloud-init v. 24.4-7.el9 running 'init-local' at Fri, 05 Dec 2025 11:43:20 +0000. Up 7.02 seconds.
Dec  5 06:43:20 np0005546954 systemd[1]: run-cloud\x2dinit-tmp-tmpm_7iwois.mount: Deactivated successfully.
Dec  5 06:43:20 np0005546954 systemd[1]: Starting Hostname Service...
Dec  5 06:43:20 np0005546954 systemd[1]: Started Hostname Service.
Dec  5 06:43:20 np0005546954 systemd-hostnamed[849]: Hostname set to <np0005546954.novalocal> (static)
Dec  5 06:43:20 np0005546954 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec  5 06:43:20 np0005546954 systemd[1]: Reached target Preparation for Network.
Dec  5 06:43:20 np0005546954 systemd[1]: Starting Network Manager...
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.8733] NetworkManager (version 1.54.1-1.el9) is starting... (boot:46eaa6a6-b96c-4b3b-a171-c5d47450a30e)
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.8739] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.8819] manager[0x557205ccf080]: monitoring kernel firmware directory '/lib/firmware'.
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.8854] hostname: hostname: using hostnamed
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.8855] hostname: static hostname changed from (none) to "np0005546954.novalocal"
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.8858] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.8974] manager[0x557205ccf080]: rfkill: Wi-Fi hardware radio set enabled
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.8975] manager[0x557205ccf080]: rfkill: WWAN hardware radio set enabled
Dec  5 06:43:20 np0005546954 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9013] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9013] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9014] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9014] manager: Networking is enabled by state file
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9016] settings: Loaded settings plugin: keyfile (internal)
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9031] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9047] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9058] dhcp: init: Using DHCP client 'internal'
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9060] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9073] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9081] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9088] device (lo): Activation: starting connection 'lo' (5db90527-565f-43c8-b47d-5e445792da38)
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9097] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9103] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9127] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9131] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9132] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9134] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9135] device (eth0): carrier: link connected
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9136] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9141] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9147] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9150] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9150] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9151] manager: NetworkManager state is now CONNECTING
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9152] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9157] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9159] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:43:20 np0005546954 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  5 06:43:20 np0005546954 systemd[1]: Started Network Manager.
Dec  5 06:43:20 np0005546954 systemd[1]: Reached target Network.
Dec  5 06:43:20 np0005546954 systemd[1]: Starting Network Manager Wait Online...
Dec  5 06:43:20 np0005546954 systemd[1]: Starting GSSAPI Proxy Daemon...
Dec  5 06:43:20 np0005546954 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9463] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9466] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  5 06:43:20 np0005546954 NetworkManager[854]: <info>  [1764935000.9471] device (lo): Activation: successful, device activated.
Dec  5 06:43:20 np0005546954 systemd[1]: Started GSSAPI Proxy Daemon.
Dec  5 06:43:20 np0005546954 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  5 06:43:20 np0005546954 systemd[1]: Reached target NFS client services.
Dec  5 06:43:20 np0005546954 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  5 06:43:20 np0005546954 systemd[1]: Reached target Remote File Systems.
Dec  5 06:43:20 np0005546954 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  5 06:43:21 np0005546954 NetworkManager[854]: <info>  [1764935001.8736] dhcp4 (eth0): state changed new lease, address=38.102.83.243
Dec  5 06:43:21 np0005546954 NetworkManager[854]: <info>  [1764935001.8752] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  5 06:43:21 np0005546954 NetworkManager[854]: <info>  [1764935001.8805] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:43:21 np0005546954 NetworkManager[854]: <info>  [1764935001.8834] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:43:21 np0005546954 NetworkManager[854]: <info>  [1764935001.8836] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:43:21 np0005546954 NetworkManager[854]: <info>  [1764935001.8840] manager: NetworkManager state is now CONNECTED_SITE
Dec  5 06:43:21 np0005546954 NetworkManager[854]: <info>  [1764935001.8844] device (eth0): Activation: successful, device activated.
Dec  5 06:43:21 np0005546954 NetworkManager[854]: <info>  [1764935001.8850] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  5 06:43:21 np0005546954 NetworkManager[854]: <info>  [1764935001.8854] manager: startup complete
Dec  5 06:43:21 np0005546954 systemd[1]: Finished Network Manager Wait Online.
Dec  5 06:43:21 np0005546954 systemd[1]: Starting Cloud-init: Network Stage...
Dec  5 06:43:22 np0005546954 cloud-init[917]: Cloud-init v. 24.4-7.el9 running 'init' at Fri, 05 Dec 2025 11:43:22 +0000. Up 8.88 seconds.
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: |  eth0  | True |        38.102.83.243         | 255.255.255.0 | global | fa:16:3e:19:ac:62 |
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: |  eth0  | True | fe80::f816:3eff:fe19:ac62/64 |       .       |  link  | fa:16:3e:19:ac:62 |
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec  5 06:43:22 np0005546954 cloud-init[917]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  5 06:43:23 np0005546954 cloud-init[917]: Generating public/private rsa key pair.
Dec  5 06:43:23 np0005546954 cloud-init[917]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec  5 06:43:23 np0005546954 cloud-init[917]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec  5 06:43:23 np0005546954 cloud-init[917]: The key fingerprint is:
Dec  5 06:43:23 np0005546954 cloud-init[917]: SHA256:KPUvqNrrIl7gObpdCNSWi3FotTZ9vTN8HWoi53jHQ9Q root@np0005546954.novalocal
Dec  5 06:43:23 np0005546954 cloud-init[917]: The key's randomart image is:
Dec  5 06:43:23 np0005546954 cloud-init[917]: +---[RSA 3072]----+
Dec  5 06:43:23 np0005546954 cloud-init[917]: |   .             |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |  + +   .   .    |
Dec  5 06:43:23 np0005546954 cloud-init[917]: | = O ... . . E   |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |o * o..o. o o .  |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |.o .. ..SB = .   |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |..o. . .=.O      |
Dec  5 06:43:23 np0005546954 cloud-init[917]: | +... ...o.+     |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |o.+o .  ... .    |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |++o+=.           |
Dec  5 06:43:23 np0005546954 cloud-init[917]: +----[SHA256]-----+
Dec  5 06:43:23 np0005546954 cloud-init[917]: Generating public/private ecdsa key pair.
Dec  5 06:43:23 np0005546954 cloud-init[917]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec  5 06:43:23 np0005546954 cloud-init[917]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec  5 06:43:23 np0005546954 cloud-init[917]: The key fingerprint is:
Dec  5 06:43:23 np0005546954 cloud-init[917]: SHA256:RXS6aIzC6moc87tpr41Ns8+XorDx6EiuVypC8vCGw1M root@np0005546954.novalocal
Dec  5 06:43:23 np0005546954 cloud-init[917]: The key's randomart image is:
Dec  5 06:43:23 np0005546954 cloud-init[917]: +---[ECDSA 256]---+
Dec  5 06:43:23 np0005546954 cloud-init[917]: |         .o .    |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |         . o     |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |          o      |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |   .   o o .     |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |    o . S .      |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |o+ E.. .         |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |=**= o    .      |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |B*Bo% +. o       |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |*O=X=Booo        |
Dec  5 06:43:23 np0005546954 cloud-init[917]: +----[SHA256]-----+
Dec  5 06:43:23 np0005546954 cloud-init[917]: Generating public/private ed25519 key pair.
Dec  5 06:43:23 np0005546954 cloud-init[917]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec  5 06:43:23 np0005546954 cloud-init[917]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec  5 06:43:23 np0005546954 cloud-init[917]: The key fingerprint is:
Dec  5 06:43:23 np0005546954 cloud-init[917]: SHA256:XOzvn2DZ7ILqWbbqdHX7wcdf4iURZFLC0T7MTQcP7A8 root@np0005546954.novalocal
Dec  5 06:43:23 np0005546954 cloud-init[917]: The key's randomart image is:
Dec  5 06:43:23 np0005546954 cloud-init[917]: +--[ED25519 256]--+
Dec  5 06:43:23 np0005546954 cloud-init[917]: |           .+=*. |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |         .  .=ooo|
Dec  5 06:43:23 np0005546954 cloud-init[917]: |          o  =.oo|
Dec  5 06:43:23 np0005546954 cloud-init[917]: |       . o    E..|
Dec  5 06:43:23 np0005546954 cloud-init[917]: |        S . . o+ |
Dec  5 06:43:23 np0005546954 cloud-init[917]: |           o =.+.|
Dec  5 06:43:23 np0005546954 cloud-init[917]: |        . +.= *o=|
Dec  5 06:43:23 np0005546954 cloud-init[917]: |       . =.+.+ *=|
Dec  5 06:43:23 np0005546954 cloud-init[917]: |       o*o. .o= o|
Dec  5 06:43:23 np0005546954 cloud-init[917]: +----[SHA256]-----+
Dec  5 06:43:23 np0005546954 systemd[1]: Finished Cloud-init: Network Stage.
Dec  5 06:43:23 np0005546954 systemd[1]: Reached target Cloud-config availability.
Dec  5 06:43:23 np0005546954 systemd[1]: Reached target Network is Online.
Dec  5 06:43:23 np0005546954 systemd[1]: Starting Cloud-init: Config Stage...
Dec  5 06:43:23 np0005546954 systemd[1]: Starting Crash recovery kernel arming...
Dec  5 06:43:23 np0005546954 systemd[1]: Starting Notify NFS peers of a restart...
Dec  5 06:43:23 np0005546954 systemd[1]: Starting System Logging Service...
Dec  5 06:43:23 np0005546954 systemd[1]: Starting OpenSSH server daemon...
Dec  5 06:43:23 np0005546954 sm-notify[1001]: Version 2.5.4 starting
Dec  5 06:43:23 np0005546954 systemd[1]: Starting Permit User Sessions...
Dec  5 06:43:23 np0005546954 systemd[1]: Started Notify NFS peers of a restart.
Dec  5 06:43:23 np0005546954 systemd[1]: Finished Permit User Sessions.
Dec  5 06:43:23 np0005546954 systemd[1]: Started Command Scheduler.
Dec  5 06:43:23 np0005546954 systemd[1]: Started Getty on tty1.
Dec  5 06:43:23 np0005546954 systemd[1]: Started Serial Getty on ttyS0.
Dec  5 06:43:23 np0005546954 systemd[1]: Reached target Login Prompts.
Dec  5 06:43:23 np0005546954 systemd[1]: Started OpenSSH server daemon.
Dec  5 06:43:23 np0005546954 systemd[1]: Started System Logging Service.
Dec  5 06:43:23 np0005546954 rsyslogd[1002]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1002" x-info="https://www.rsyslog.com"] start
Dec  5 06:43:23 np0005546954 rsyslogd[1002]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec  5 06:43:23 np0005546954 systemd[1]: Reached target Multi-User System.
Dec  5 06:43:23 np0005546954 systemd[1]: Starting Record Runlevel Change in UTMP...
Dec  5 06:43:23 np0005546954 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec  5 06:43:23 np0005546954 systemd[1]: Finished Record Runlevel Change in UTMP.
Dec  5 06:43:24 np0005546954 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 06:43:24 np0005546954 kdumpctl[1011]: kdump: No kdump initial ramdisk found.
Dec  5 06:43:24 np0005546954 kdumpctl[1011]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec  5 06:43:24 np0005546954 cloud-init[1134]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Fri, 05 Dec 2025 11:43:24 +0000. Up 10.86 seconds.
Dec  5 06:43:24 np0005546954 systemd[1]: Finished Cloud-init: Config Stage.
Dec  5 06:43:24 np0005546954 systemd[1]: Starting Cloud-init: Final Stage...
Dec  5 06:43:24 np0005546954 dracut[1262]: dracut-057-102.git20250818.el9
Dec  5 06:43:24 np0005546954 cloud-init[1283]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Fri, 05 Dec 2025 11:43:24 +0000. Up 11.27 seconds.
Dec  5 06:43:24 np0005546954 dracut[1264]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec  5 06:43:24 np0005546954 cloud-init[1312]: #############################################################
Dec  5 06:43:24 np0005546954 cloud-init[1320]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec  5 06:43:24 np0005546954 cloud-init[1328]: 256 SHA256:RXS6aIzC6moc87tpr41Ns8+XorDx6EiuVypC8vCGw1M root@np0005546954.novalocal (ECDSA)
Dec  5 06:43:24 np0005546954 cloud-init[1336]: 256 SHA256:XOzvn2DZ7ILqWbbqdHX7wcdf4iURZFLC0T7MTQcP7A8 root@np0005546954.novalocal (ED25519)
Dec  5 06:43:24 np0005546954 cloud-init[1341]: 3072 SHA256:KPUvqNrrIl7gObpdCNSWi3FotTZ9vTN8HWoi53jHQ9Q root@np0005546954.novalocal (RSA)
Dec  5 06:43:24 np0005546954 cloud-init[1342]: -----END SSH HOST KEY FINGERPRINTS-----
Dec  5 06:43:24 np0005546954 cloud-init[1343]: #############################################################
Dec  5 06:43:24 np0005546954 cloud-init[1283]: Cloud-init v. 24.4-7.el9 finished at Fri, 05 Dec 2025 11:43:24 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.45 seconds
Dec  5 06:43:24 np0005546954 systemd[1]: Finished Cloud-init: Final Stage.
Dec  5 06:43:24 np0005546954 systemd[1]: Reached target Cloud-init target.
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: memstrack is not available
Dec  5 06:43:25 np0005546954 dracut[1264]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  5 06:43:25 np0005546954 dracut[1264]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  5 06:43:26 np0005546954 dracut[1264]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  5 06:43:26 np0005546954 dracut[1264]: memstrack is not available
Dec  5 06:43:26 np0005546954 dracut[1264]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  5 06:43:26 np0005546954 dracut[1264]: *** Including module: systemd ***
Dec  5 06:43:26 np0005546954 dracut[1264]: *** Including module: fips ***
Dec  5 06:43:26 np0005546954 dracut[1264]: *** Including module: systemd-initrd ***
Dec  5 06:43:26 np0005546954 dracut[1264]: *** Including module: i18n ***
Dec  5 06:43:26 np0005546954 dracut[1264]: *** Including module: drm ***
Dec  5 06:43:26 np0005546954 chronyd[791]: Selected source 174.138.193.90 (2.centos.pool.ntp.org)
Dec  5 06:43:26 np0005546954 chronyd[791]: System clock TAI offset set to 37 seconds
Dec  5 06:43:27 np0005546954 dracut[1264]: *** Including module: prefixdevname ***
Dec  5 06:43:27 np0005546954 dracut[1264]: *** Including module: kernel-modules ***
Dec  5 06:43:27 np0005546954 kernel: block vda: the capability attribute has been deprecated.
Dec  5 06:43:27 np0005546954 dracut[1264]: *** Including module: kernel-modules-extra ***
Dec  5 06:43:27 np0005546954 dracut[1264]: *** Including module: qemu ***
Dec  5 06:43:27 np0005546954 dracut[1264]: *** Including module: fstab-sys ***
Dec  5 06:43:27 np0005546954 dracut[1264]: *** Including module: rootfs-block ***
Dec  5 06:43:27 np0005546954 dracut[1264]: *** Including module: terminfo ***
Dec  5 06:43:27 np0005546954 dracut[1264]: *** Including module: udev-rules ***
Dec  5 06:43:28 np0005546954 dracut[1264]: Skipping udev rule: 91-permissions.rules
Dec  5 06:43:28 np0005546954 dracut[1264]: Skipping udev rule: 80-drivers-modprobe.rules
Dec  5 06:43:28 np0005546954 dracut[1264]: *** Including module: virtiofs ***
Dec  5 06:43:28 np0005546954 dracut[1264]: *** Including module: dracut-systemd ***
Dec  5 06:43:28 np0005546954 chronyd[791]: Selected source 167.160.187.12 (2.centos.pool.ntp.org)
Dec  5 06:43:28 np0005546954 dracut[1264]: *** Including module: usrmount ***
Dec  5 06:43:28 np0005546954 dracut[1264]: *** Including module: base ***
Dec  5 06:43:28 np0005546954 dracut[1264]: *** Including module: fs-lib ***
Dec  5 06:43:28 np0005546954 dracut[1264]: *** Including module: kdumpbase ***
Dec  5 06:43:29 np0005546954 dracut[1264]: *** Including module: microcode_ctl-fw_dir_override ***
Dec  5 06:43:29 np0005546954 dracut[1264]:  microcode_ctl module: mangling fw_dir
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: configuration "intel" is ignored
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec  5 06:43:29 np0005546954 dracut[1264]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec  5 06:43:29 np0005546954 dracut[1264]: *** Including module: openssl ***
Dec  5 06:43:29 np0005546954 dracut[1264]: *** Including module: shutdown ***
Dec  5 06:43:29 np0005546954 dracut[1264]: *** Including module: squash ***
Dec  5 06:43:29 np0005546954 dracut[1264]: *** Including modules done ***
Dec  5 06:43:29 np0005546954 dracut[1264]: *** Installing kernel module dependencies ***
Dec  5 06:43:30 np0005546954 irqbalance[776]: Cannot change IRQ 35 affinity: Operation not permitted
Dec  5 06:43:30 np0005546954 irqbalance[776]: IRQ 35 affinity is now unmanaged
Dec  5 06:43:30 np0005546954 irqbalance[776]: Cannot change IRQ 33 affinity: Operation not permitted
Dec  5 06:43:30 np0005546954 irqbalance[776]: IRQ 33 affinity is now unmanaged
Dec  5 06:43:30 np0005546954 irqbalance[776]: Cannot change IRQ 31 affinity: Operation not permitted
Dec  5 06:43:30 np0005546954 irqbalance[776]: IRQ 31 affinity is now unmanaged
Dec  5 06:43:30 np0005546954 irqbalance[776]: Cannot change IRQ 28 affinity: Operation not permitted
Dec  5 06:43:30 np0005546954 irqbalance[776]: IRQ 28 affinity is now unmanaged
Dec  5 06:43:30 np0005546954 irqbalance[776]: Cannot change IRQ 34 affinity: Operation not permitted
Dec  5 06:43:30 np0005546954 irqbalance[776]: IRQ 34 affinity is now unmanaged
Dec  5 06:43:30 np0005546954 irqbalance[776]: Cannot change IRQ 32 affinity: Operation not permitted
Dec  5 06:43:30 np0005546954 irqbalance[776]: IRQ 32 affinity is now unmanaged
Dec  5 06:43:30 np0005546954 irqbalance[776]: Cannot change IRQ 30 affinity: Operation not permitted
Dec  5 06:43:30 np0005546954 irqbalance[776]: IRQ 30 affinity is now unmanaged
Dec  5 06:43:30 np0005546954 irqbalance[776]: Cannot change IRQ 29 affinity: Operation not permitted
Dec  5 06:43:30 np0005546954 irqbalance[776]: IRQ 29 affinity is now unmanaged
Dec  5 06:43:30 np0005546954 dracut[1264]: *** Installing kernel module dependencies done ***
Dec  5 06:43:30 np0005546954 dracut[1264]: *** Resolving executable dependencies ***
Dec  5 06:43:31 np0005546954 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  5 06:43:32 np0005546954 dracut[1264]: *** Resolving executable dependencies done ***
Dec  5 06:43:32 np0005546954 dracut[1264]: *** Generating early-microcode cpio image ***
Dec  5 06:43:32 np0005546954 dracut[1264]: *** Store current command line parameters ***
Dec  5 06:43:32 np0005546954 dracut[1264]: Stored kernel commandline:
Dec  5 06:43:32 np0005546954 dracut[1264]: No dracut internal kernel commandline stored in the initramfs
Dec  5 06:43:32 np0005546954 dracut[1264]: *** Install squash loader ***
Dec  5 06:43:33 np0005546954 dracut[1264]: *** Squashing the files inside the initramfs ***
Dec  5 06:43:34 np0005546954 dracut[1264]: *** Squashing the files inside the initramfs done ***
Dec  5 06:43:34 np0005546954 dracut[1264]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec  5 06:43:34 np0005546954 dracut[1264]: *** Hardlinking files ***
Dec  5 06:43:34 np0005546954 dracut[1264]: *** Hardlinking files done ***
Dec  5 06:43:34 np0005546954 dracut[1264]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec  5 06:43:35 np0005546954 kdumpctl[1011]: kdump: kexec: loaded kdump kernel
Dec  5 06:43:35 np0005546954 kdumpctl[1011]: kdump: Starting kdump: [OK]
Dec  5 06:43:35 np0005546954 systemd[1]: Finished Crash recovery kernel arming.
Dec  5 06:43:35 np0005546954 systemd[1]: Startup finished in 1.618s (kernel) + 3.403s (initrd) + 17.058s (userspace) = 22.080s.
Dec  5 06:43:42 np0005546954 systemd[1]: Created slice User Slice of UID 1000.
Dec  5 06:43:42 np0005546954 systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec  5 06:43:42 np0005546954 systemd-logind[789]: New session 1 of user zuul.
Dec  5 06:43:42 np0005546954 systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec  5 06:43:42 np0005546954 systemd[1]: Starting User Manager for UID 1000...
Dec  5 06:43:42 np0005546954 systemd[4296]: Queued start job for default target Main User Target.
Dec  5 06:43:42 np0005546954 systemd[4296]: Created slice User Application Slice.
Dec  5 06:43:42 np0005546954 systemd[4296]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  5 06:43:42 np0005546954 systemd[4296]: Started Daily Cleanup of User's Temporary Directories.
Dec  5 06:43:42 np0005546954 systemd[4296]: Reached target Paths.
Dec  5 06:43:42 np0005546954 systemd[4296]: Reached target Timers.
Dec  5 06:43:42 np0005546954 systemd[4296]: Starting D-Bus User Message Bus Socket...
Dec  5 06:43:42 np0005546954 systemd[4296]: Starting Create User's Volatile Files and Directories...
Dec  5 06:43:42 np0005546954 systemd[4296]: Listening on D-Bus User Message Bus Socket.
Dec  5 06:43:42 np0005546954 systemd[4296]: Reached target Sockets.
Dec  5 06:43:42 np0005546954 systemd[4296]: Finished Create User's Volatile Files and Directories.
Dec  5 06:43:42 np0005546954 systemd[4296]: Reached target Basic System.
Dec  5 06:43:42 np0005546954 systemd[4296]: Reached target Main User Target.
Dec  5 06:43:42 np0005546954 systemd[4296]: Startup finished in 140ms.
Dec  5 06:43:42 np0005546954 systemd[1]: Started User Manager for UID 1000.
Dec  5 06:43:42 np0005546954 systemd[1]: Started Session 1 of User zuul.
Dec  5 06:43:43 np0005546954 python3[4378]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:43:46 np0005546954 python3[4406]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:43:50 np0005546954 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  5 06:43:53 np0005546954 python3[4466]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:43:54 np0005546954 python3[4506]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec  5 06:43:56 np0005546954 python3[4532]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/yl4TxjuFUF5sQr678Zp9IdPUvobRHvquk3+hp6vPfct0AVbxQJcJWU9gW5sFfu7rQ2y/Gz3HhgmMecEVoYAtfW4n9JsjVoaB3aqJXgSZ9xjwlE3WHDc37+b4Bl7pMzM+Qy4nqETVv+f94R3lr6xaB77uv+H+oQxhvJv8r9b0z43h2Xx8HUM9aeWOJfA5mXsuw+zt3M7fvMR9g19BNMlUpasUu5fEiCyHgAG9sgfNqyXgdxS2NLQ2a9RJ1hEquI810Al7TeZKH+xndVukt4SjMwGgJbd3IWrvvpLhZYegc6nh2egFwJwDboR8H3fdntftXXhiwX4yGunfrNlLz2LK89U0T/oJJIxD1mw/e7u1ERXzJfHNFbwMMQnaY/F4P66SU2VhVfUdV7SPYKMFd5URKHgehHf/qWgz7X5MM4s6wT+YyJ4VU28oOy1kUzLC8avnCLDn6niOGt4tV51vwzSMlMfW8I9JXp0IddCsd9ht2PJG1lk+b+jo+JUo9VtIL9U= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:43:57 np0005546954 python3[4556]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:57 np0005546954 python3[4655]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:43:57 np0005546954 python3[4726]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935037.2619667-230-146996928209128/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=6b21ad18a47a4abfbbceb52fb7439e14_id_rsa follow=False checksum=d1a14726dde8c20be168662b9201c87b62457be8 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:58 np0005546954 python3[4849]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:43:58 np0005546954 python3[4920]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935038.1386921-274-48704150204122/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=6b21ad18a47a4abfbbceb52fb7439e14_id_rsa.pub follow=False checksum=fffa082400bc9c902b8f279b2f9e4248e62c64c2 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:00 np0005546954 python3[4968]: ansible-ping Invoked with data=pong
Dec  5 06:44:01 np0005546954 python3[4992]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:44:02 np0005546954 python3[5050]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec  5 06:44:03 np0005546954 python3[5082]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:04 np0005546954 python3[5106]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:04 np0005546954 python3[5130]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:04 np0005546954 python3[5154]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:05 np0005546954 python3[5178]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:05 np0005546954 python3[5202]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:07 np0005546954 python3[5228]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:07 np0005546954 python3[5306]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:44:08 np0005546954 python3[5379]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935047.2369409-27-228247994943918/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:08 np0005546954 python3[5427]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:09 np0005546954 python3[5451]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:09 np0005546954 python3[5475]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:09 np0005546954 python3[5499]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:10 np0005546954 python3[5523]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:10 np0005546954 python3[5547]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:10 np0005546954 python3[5571]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:11 np0005546954 python3[5595]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:11 np0005546954 python3[5619]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:11 np0005546954 python3[5643]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:11 np0005546954 python3[5667]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:12 np0005546954 python3[5691]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:12 np0005546954 python3[5715]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:12 np0005546954 python3[5739]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:13 np0005546954 python3[5763]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:13 np0005546954 python3[5787]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:14 np0005546954 python3[5811]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:14 np0005546954 python3[5835]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:14 np0005546954 python3[5859]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:15 np0005546954 python3[5883]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:15 np0005546954 python3[5907]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:15 np0005546954 python3[5931]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:15 np0005546954 python3[5955]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:16 np0005546954 python3[5979]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:16 np0005546954 python3[6003]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:16 np0005546954 python3[6027]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:44:18 np0005546954 python3[6053]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  5 06:44:18 np0005546954 systemd[1]: Starting Time & Date Service...
Dec  5 06:44:18 np0005546954 systemd[1]: Started Time & Date Service.
Dec  5 06:44:19 np0005546954 systemd-timedated[6055]: Changed time zone to 'UTC' (UTC).
Dec  5 06:44:20 np0005546954 python3[6084]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:20 np0005546954 python3[6160]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:44:21 np0005546954 python3[6231]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764935060.4782367-204-244509355335943/source _original_basename=tmpiryldur7 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:21 np0005546954 python3[6331]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:44:22 np0005546954 python3[6402]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764935061.3669822-244-82456783756666/source _original_basename=tmpvj2z43b9 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:22 np0005546954 python3[6504]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:44:23 np0005546954 python3[6577]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764935062.538517-307-232066798061451/source _original_basename=tmpjospyxd3 follow=False checksum=ee9b126fe33e72500e994fd3e8d9deaa54707872 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:23 np0005546954 python3[6625]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:44:24 np0005546954 python3[6651]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:44:24 np0005546954 python3[6731]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:44:25 np0005546954 python3[6804]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935064.445417-363-276155897134784/source _original_basename=tmp8mrxo9xc follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:25 np0005546954 python3[6855]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-9b7c-d9fe-00000000001e-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:44:26 np0005546954 python3[6883]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-9b7c-d9fe-00000000001f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec  5 06:44:27 np0005546954 python3[6911]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:46 np0005546954 python3[6937]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:49 np0005546954 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  5 06:45:45 np0005546954 systemd[4296]: Starting Mark boot as successful...
Dec  5 06:45:45 np0005546954 systemd[4296]: Finished Mark boot as successful.
Dec  5 06:45:46 np0005546954 systemd-logind[789]: Session 1 logged out. Waiting for processes to exit.
Dec  5 06:45:49 np0005546954 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  5 06:45:49 np0005546954 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec  5 06:45:49 np0005546954 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec  5 06:45:49 np0005546954 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec  5 06:45:49 np0005546954 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec  5 06:45:49 np0005546954 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec  5 06:45:49 np0005546954 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec  5 06:45:49 np0005546954 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec  5 06:45:49 np0005546954 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec  5 06:45:49 np0005546954 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec  5 06:45:49 np0005546954 NetworkManager[854]: <info>  [1764935149.7921] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  5 06:45:49 np0005546954 systemd-udevd[6941]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:45:49 np0005546954 NetworkManager[854]: <info>  [1764935149.8095] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:45:49 np0005546954 NetworkManager[854]: <info>  [1764935149.8120] settings: (eth1): created default wired connection 'Wired connection 1'
Dec  5 06:45:49 np0005546954 NetworkManager[854]: <info>  [1764935149.8123] device (eth1): carrier: link connected
Dec  5 06:45:49 np0005546954 NetworkManager[854]: <info>  [1764935149.8125] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  5 06:45:49 np0005546954 NetworkManager[854]: <info>  [1764935149.8130] policy: auto-activating connection 'Wired connection 1' (389ad192-0e8f-3736-8c5b-2cd019e054e1)
Dec  5 06:45:49 np0005546954 NetworkManager[854]: <info>  [1764935149.8134] device (eth1): Activation: starting connection 'Wired connection 1' (389ad192-0e8f-3736-8c5b-2cd019e054e1)
Dec  5 06:45:49 np0005546954 NetworkManager[854]: <info>  [1764935149.8134] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:45:49 np0005546954 NetworkManager[854]: <info>  [1764935149.8137] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:45:49 np0005546954 NetworkManager[854]: <info>  [1764935149.8140] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:45:49 np0005546954 NetworkManager[854]: <info>  [1764935149.8144] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:45:50 np0005546954 systemd-logind[789]: New session 3 of user zuul.
Dec  5 06:45:50 np0005546954 systemd[1]: Started Session 3 of User zuul.
Dec  5 06:45:51 np0005546954 python3[6972]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-c45c-524b-000000000173-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:45:58 np0005546954 python3[7052]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:45:58 np0005546954 python3[7125]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935157.7290938-154-78591813322827/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=afe566cb65da5e65375e7b286a05f403818c3ea3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:45:58 np0005546954 python3[7175]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:45:58 np0005546954 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  5 06:45:58 np0005546954 systemd[1]: Stopped Network Manager Wait Online.
Dec  5 06:45:58 np0005546954 systemd[1]: Stopping Network Manager Wait Online...
Dec  5 06:45:58 np0005546954 systemd[1]: Stopping Network Manager...
Dec  5 06:45:58 np0005546954 NetworkManager[854]: <info>  [1764935158.9590] caught SIGTERM, shutting down normally.
Dec  5 06:45:58 np0005546954 NetworkManager[854]: <info>  [1764935158.9606] dhcp4 (eth0): canceled DHCP transaction
Dec  5 06:45:58 np0005546954 NetworkManager[854]: <info>  [1764935158.9606] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:45:58 np0005546954 NetworkManager[854]: <info>  [1764935158.9606] dhcp4 (eth0): state changed no lease
Dec  5 06:45:58 np0005546954 NetworkManager[854]: <info>  [1764935158.9609] manager: NetworkManager state is now CONNECTING
Dec  5 06:45:58 np0005546954 NetworkManager[854]: <info>  [1764935158.9701] dhcp4 (eth1): canceled DHCP transaction
Dec  5 06:45:58 np0005546954 NetworkManager[854]: <info>  [1764935158.9702] dhcp4 (eth1): state changed no lease
Dec  5 06:45:58 np0005546954 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  5 06:45:58 np0005546954 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  5 06:45:58 np0005546954 NetworkManager[854]: <info>  [1764935158.9872] exiting (success)
Dec  5 06:45:59 np0005546954 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  5 06:45:59 np0005546954 systemd[1]: Stopped Network Manager.
Dec  5 06:45:59 np0005546954 systemd[1]: NetworkManager.service: Consumed 1.256s CPU time, 10.0M memory peak.
Dec  5 06:45:59 np0005546954 systemd[1]: Starting Network Manager...
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.0590] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:46eaa6a6-b96c-4b3b-a171-c5d47450a30e)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.0592] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.0655] manager[0x55ff83bda070]: monitoring kernel firmware directory '/lib/firmware'.
Dec  5 06:45:59 np0005546954 systemd[1]: Starting Hostname Service...
Dec  5 06:45:59 np0005546954 systemd[1]: Started Hostname Service.
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1418] hostname: hostname: using hostnamed
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1419] hostname: static hostname changed from (none) to "np0005546954.novalocal"
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1432] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1443] manager[0x55ff83bda070]: rfkill: Wi-Fi hardware radio set enabled
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1444] manager[0x55ff83bda070]: rfkill: WWAN hardware radio set enabled
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1493] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1494] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1495] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1496] manager: Networking is enabled by state file
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1500] settings: Loaded settings plugin: keyfile (internal)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1507] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1553] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1563] dhcp: init: Using DHCP client 'internal'
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1566] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1570] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1577] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1583] device (lo): Activation: starting connection 'lo' (5db90527-565f-43c8-b47d-5e445792da38)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1587] device (eth0): carrier: link connected
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1591] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1594] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1595] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1599] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1605] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1610] device (eth1): carrier: link connected
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1613] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1617] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (389ad192-0e8f-3736-8c5b-2cd019e054e1) (indicated)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1618] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1622] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1628] device (eth1): Activation: starting connection 'Wired connection 1' (389ad192-0e8f-3736-8c5b-2cd019e054e1)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1634] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  5 06:45:59 np0005546954 systemd[1]: Started Network Manager.
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1637] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1640] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1641] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1642] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1644] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1645] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1648] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1651] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1655] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1657] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1663] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1665] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1689] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1690] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1696] device (lo): Activation: successful, device activated.
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1703] dhcp4 (eth0): state changed new lease, address=38.102.83.243
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.1709] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  5 06:45:59 np0005546954 systemd[1]: Starting Network Manager Wait Online...
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.2175] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.2275] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.2276] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.2279] manager: NetworkManager state is now CONNECTED_SITE
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.2283] device (eth0): Activation: successful, device activated.
Dec  5 06:45:59 np0005546954 NetworkManager[7188]: <info>  [1764935159.2287] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  5 06:45:59 np0005546954 python3[7259]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-c45c-524b-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:46:09 np0005546954 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  5 06:46:29 np0005546954 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3164] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  5 06:46:44 np0005546954 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  5 06:46:44 np0005546954 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3464] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3467] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3473] device (eth1): Activation: successful, device activated.
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3487] manager: startup complete
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3490] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <warn>  [1764935204.3495] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3502] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec  5 06:46:44 np0005546954 systemd[1]: Finished Network Manager Wait Online.
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3672] dhcp4 (eth1): canceled DHCP transaction
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3672] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3673] dhcp4 (eth1): state changed no lease
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3688] policy: auto-activating connection 'ci-private-network' (561dd4dc-d770-5231-ac01-02964b3b80f6)
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3692] device (eth1): Activation: starting connection 'ci-private-network' (561dd4dc-d770-5231-ac01-02964b3b80f6)
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3693] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3696] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3703] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.3711] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.4385] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.4388] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:46:44 np0005546954 NetworkManager[7188]: <info>  [1764935204.4395] device (eth1): Activation: successful, device activated.
Dec  5 06:46:54 np0005546954 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  5 06:46:59 np0005546954 systemd[1]: session-3.scope: Deactivated successfully.
Dec  5 06:46:59 np0005546954 systemd[1]: session-3.scope: Consumed 1.625s CPU time.
Dec  5 06:46:59 np0005546954 systemd-logind[789]: Session 3 logged out. Waiting for processes to exit.
Dec  5 06:46:59 np0005546954 systemd-logind[789]: Removed session 3.
Dec  5 06:47:06 np0005546954 systemd-logind[789]: New session 4 of user zuul.
Dec  5 06:47:06 np0005546954 systemd[1]: Started Session 4 of User zuul.
Dec  5 06:47:06 np0005546954 python3[7368]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:47:07 np0005546954 python3[7441]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935226.527851-312-252272644080776/source _original_basename=tmpmjf1k0ew follow=False checksum=f7c98fd57b6aa190c2b834d160bd229fbae4e551 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:47:09 np0005546954 systemd[1]: session-4.scope: Deactivated successfully.
Dec  5 06:47:09 np0005546954 systemd-logind[789]: Session 4 logged out. Waiting for processes to exit.
Dec  5 06:47:09 np0005546954 systemd-logind[789]: Removed session 4.
Dec  5 06:48:45 np0005546954 systemd[4296]: Created slice User Background Tasks Slice.
Dec  5 06:48:45 np0005546954 systemd[4296]: Starting Cleanup of User's Temporary Files and Directories...
Dec  5 06:48:45 np0005546954 systemd[4296]: Finished Cleanup of User's Temporary Files and Directories.
Dec  5 06:48:51 np0005546954 chronyd[791]: Selected source 174.138.193.90 (2.centos.pool.ntp.org)
Dec  5 06:54:23 np0005546954 systemd-logind[789]: New session 5 of user zuul.
Dec  5 06:54:23 np0005546954 systemd[1]: Started Session 5 of User zuul.
Dec  5 06:54:23 np0005546954 python3[7503]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-da47-a048-000000001cdd-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:54:23 np0005546954 python3[7531]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:24 np0005546954 python3[7558]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:24 np0005546954 python3[7584]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:24 np0005546954 python3[7610]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:25 np0005546954 python3[7636]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:25 np0005546954 python3[7714]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:54:26 np0005546954 python3[7787]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935665.4109519-506-175577637321899/source _original_basename=tmpw9korm0w follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:26 np0005546954 python3[7837]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:54:26 np0005546954 systemd[1]: Reloading.
Dec  5 06:54:26 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:54:28 np0005546954 python3[7894]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec  5 06:54:29 np0005546954 python3[7920]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:54:29 np0005546954 python3[7948]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:54:29 np0005546954 python3[7976]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:54:29 np0005546954 python3[8004]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:54:30 np0005546954 python3[8031]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-da47-a048-000000001ce4-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:54:30 np0005546954 python3[8061]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  5 06:54:33 np0005546954 systemd[1]: session-5.scope: Deactivated successfully.
Dec  5 06:54:33 np0005546954 systemd[1]: session-5.scope: Consumed 4.195s CPU time.
Dec  5 06:54:33 np0005546954 systemd-logind[789]: Session 5 logged out. Waiting for processes to exit.
Dec  5 06:54:33 np0005546954 systemd-logind[789]: Removed session 5.
Dec  5 06:54:34 np0005546954 systemd-logind[789]: New session 6 of user zuul.
Dec  5 06:54:35 np0005546954 systemd[1]: Started Session 6 of User zuul.
Dec  5 06:54:35 np0005546954 python3[8095]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  5 06:54:56 np0005546954 kernel: SELinux:  Converting 385 SID table entries...
Dec  5 06:54:56 np0005546954 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:54:56 np0005546954 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:54:56 np0005546954 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:54:56 np0005546954 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:54:56 np0005546954 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:54:56 np0005546954 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:54:56 np0005546954 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:55:09 np0005546954 kernel: SELinux:  Converting 385 SID table entries...
Dec  5 06:55:09 np0005546954 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:55:09 np0005546954 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:55:09 np0005546954 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:55:09 np0005546954 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:55:09 np0005546954 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:55:09 np0005546954 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:55:09 np0005546954 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:55:21 np0005546954 kernel: SELinux:  Converting 385 SID table entries...
Dec  5 06:55:21 np0005546954 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:55:21 np0005546954 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:55:21 np0005546954 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:55:21 np0005546954 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:55:21 np0005546954 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:55:21 np0005546954 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:55:21 np0005546954 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:55:23 np0005546954 setsebool[8165]: The virt_use_nfs policy boolean was changed to 1 by root
Dec  5 06:55:23 np0005546954 setsebool[8165]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec  5 06:55:36 np0005546954 kernel: SELinux:  Converting 388 SID table entries...
Dec  5 06:55:36 np0005546954 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:55:36 np0005546954 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:55:36 np0005546954 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:55:36 np0005546954 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:55:36 np0005546954 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:55:36 np0005546954 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:55:36 np0005546954 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:55:55 np0005546954 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  5 06:55:55 np0005546954 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 06:55:55 np0005546954 systemd[1]: Starting man-db-cache-update.service...
Dec  5 06:55:55 np0005546954 systemd[1]: Reloading.
Dec  5 06:55:55 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:55:55 np0005546954 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 06:56:00 np0005546954 python3[12594]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-ac42-d21a-00000000000b-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:56:01 np0005546954 kernel: evm: overlay not supported
Dec  5 06:56:01 np0005546954 systemd[4296]: Starting D-Bus User Message Bus...
Dec  5 06:56:01 np0005546954 dbus-broker-launch[13505]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec  5 06:56:01 np0005546954 dbus-broker-launch[13505]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec  5 06:56:01 np0005546954 dbus-broker-lau[13505]: Ready
Dec  5 06:56:01 np0005546954 systemd[4296]: Started D-Bus User Message Bus.
Dec  5 06:56:01 np0005546954 systemd[4296]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  5 06:56:01 np0005546954 systemd[4296]: Created slice Slice /user.
Dec  5 06:56:01 np0005546954 systemd[4296]: podman-13363.scope: unit configures an IP firewall, but not running as root.
Dec  5 06:56:01 np0005546954 systemd[4296]: (This warning is only shown for the first unit using IP firewalling.)
Dec  5 06:56:01 np0005546954 systemd[4296]: Started podman-13363.scope.
Dec  5 06:56:01 np0005546954 systemd[4296]: Started podman-pause-9121fe6e.scope.
Dec  5 06:56:02 np0005546954 python3[13929]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.70:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.70:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:56:02 np0005546954 python3[13929]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec  5 06:56:03 np0005546954 systemd[1]: session-6.scope: Deactivated successfully.
Dec  5 06:56:03 np0005546954 systemd[1]: session-6.scope: Consumed 1min 9.303s CPU time.
Dec  5 06:56:03 np0005546954 systemd-logind[789]: Session 6 logged out. Waiting for processes to exit.
Dec  5 06:56:03 np0005546954 systemd-logind[789]: Removed session 6.
Dec  5 06:56:33 np0005546954 systemd-logind[789]: New session 7 of user zuul.
Dec  5 06:56:33 np0005546954 systemd[1]: Started Session 7 of User zuul.
Dec  5 06:56:33 np0005546954 python3[24904]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKkGwGidW0Qw6tEKJfJ+duXdqHj4Sp2Z15V0x2e3CA0eYJhJTVRKybt47r/2oyPyJP9ozrvFEYAkmJbyY8RKZ2s= zuul@np0005546952.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:56:34 np0005546954 python3[25103]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKkGwGidW0Qw6tEKJfJ+duXdqHj4Sp2Z15V0x2e3CA0eYJhJTVRKybt47r/2oyPyJP9ozrvFEYAkmJbyY8RKZ2s= zuul@np0005546952.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:56:34 np0005546954 python3[25449]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005546954.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec  5 06:56:35 np0005546954 python3[25749]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKkGwGidW0Qw6tEKJfJ+duXdqHj4Sp2Z15V0x2e3CA0eYJhJTVRKybt47r/2oyPyJP9ozrvFEYAkmJbyY8RKZ2s= zuul@np0005546952.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:56:36 np0005546954 python3[25997]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:56:36 np0005546954 python3[26265]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935795.741988-152-252393829918706/source _original_basename=tmpgzmbbryi follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:56:37 np0005546954 python3[26584]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Dec  5 06:56:37 np0005546954 systemd[1]: Starting Hostname Service...
Dec  5 06:56:37 np0005546954 systemd[1]: Started Hostname Service.
Dec  5 06:56:37 np0005546954 systemd-hostnamed[26684]: Changed pretty hostname to 'compute-1'
Dec  5 06:56:37 np0005546954 systemd-hostnamed[26684]: Hostname set to <compute-1> (static)
Dec  5 06:56:37 np0005546954 NetworkManager[7188]: <info>  [1764935797.5769] hostname: static hostname changed from "np0005546954.novalocal" to "compute-1"
Dec  5 06:56:37 np0005546954 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  5 06:56:37 np0005546954 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  5 06:56:37 np0005546954 systemd[1]: session-7.scope: Deactivated successfully.
Dec  5 06:56:37 np0005546954 systemd[1]: session-7.scope: Consumed 2.407s CPU time.
Dec  5 06:56:37 np0005546954 systemd-logind[789]: Session 7 logged out. Waiting for processes to exit.
Dec  5 06:56:37 np0005546954 systemd-logind[789]: Removed session 7.
Dec  5 06:56:47 np0005546954 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  5 06:56:48 np0005546954 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 06:56:48 np0005546954 systemd[1]: Finished man-db-cache-update.service.
Dec  5 06:56:48 np0005546954 systemd[1]: man-db-cache-update.service: Consumed 1min 910ms CPU time.
Dec  5 06:56:48 np0005546954 systemd[1]: run-rdaf5f49df3174fae92d007f297ab4234.service: Deactivated successfully.
Dec  5 06:57:07 np0005546954 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  5 06:58:45 np0005546954 systemd[1]: Starting Cleanup of Temporary Directories...
Dec  5 06:58:45 np0005546954 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec  5 06:58:45 np0005546954 systemd[1]: Finished Cleanup of Temporary Directories.
Dec  5 06:58:45 np0005546954 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec  5 07:00:46 np0005546954 systemd-logind[789]: New session 8 of user zuul.
Dec  5 07:00:46 np0005546954 systemd[1]: Started Session 8 of User zuul.
Dec  5 07:00:46 np0005546954 python3[30058]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:00:48 np0005546954 python3[30174]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 07:00:49 np0005546954 python3[30247]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764936048.6626952-33797-155015880112511/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:00:49 np0005546954 python3[30273]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 07:00:49 np0005546954 python3[30346]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764936048.6626952-33797-155015880112511/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:00:50 np0005546954 python3[30372]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 07:00:50 np0005546954 python3[30445]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764936048.6626952-33797-155015880112511/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:00:51 np0005546954 python3[30471]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 07:00:51 np0005546954 python3[30544]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764936048.6626952-33797-155015880112511/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:00:51 np0005546954 python3[30570]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 07:00:52 np0005546954 python3[30643]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764936048.6626952-33797-155015880112511/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:00:52 np0005546954 python3[30669]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 07:00:52 np0005546954 python3[30742]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764936048.6626952-33797-155015880112511/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:00:52 np0005546954 python3[30768]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 07:00:53 np0005546954 python3[30841]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764936048.6626952-33797-155015880112511/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:02:09 np0005546954 python3[30905]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:07:09 np0005546954 systemd[1]: session-8.scope: Deactivated successfully.
Dec  5 07:07:09 np0005546954 systemd[1]: session-8.scope: Consumed 5.087s CPU time.
Dec  5 07:07:09 np0005546954 systemd-logind[789]: Session 8 logged out. Waiting for processes to exit.
Dec  5 07:07:09 np0005546954 systemd-logind[789]: Removed session 8.
Dec  5 07:10:45 np0005546954 systemd[1]: Starting dnf makecache...
Dec  5 07:10:45 np0005546954 dnf[30913]: Failed determining last makecache time.
Dec  5 07:10:45 np0005546954 dnf[30913]: delorean-openstack-barbican-42b4c41831408a8e323 323 kB/s |  13 kB     00:00
Dec  5 07:10:45 np0005546954 dnf[30913]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 2.4 MB/s |  65 kB     00:00
Dec  5 07:10:45 np0005546954 dnf[30913]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.2 MB/s |  32 kB     00:00
Dec  5 07:10:45 np0005546954 dnf[30913]: delorean-python-stevedore-c4acc5639fd2329372142 5.0 MB/s | 131 kB     00:00
Dec  5 07:10:45 np0005546954 dnf[30913]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.3 MB/s |  32 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-os-net-config-d0cedbdb788d43e5c7551df5  12 MB/s | 349 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 1.3 MB/s |  42 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-python-designate-tests-tempest-347fdbc 764 kB/s |  18 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-openstack-glance-1fd12c29b339f30fe823e 781 kB/s |  18 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.0 MB/s |  29 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-openstack-manila-3c01b7181572c95dac462 968 kB/s |  25 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-python-whitebox-neutron-tests-tempest- 5.6 MB/s | 154 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-openstack-octavia-ba397f07a7331190208c 1.1 MB/s |  26 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-openstack-watcher-c014f81a8647287f6dcc 679 kB/s |  16 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-ansible-config_template-5ccaa22121a7ff 326 kB/s | 7.4 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 5.1 MB/s | 144 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-openstack-swift-dc98a8463506ac520c469a 584 kB/s |  14 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-python-tempestconf-8515371b7cceebd4282 2.1 MB/s |  53 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: delorean-openstack-heat-ui-013accbfd179753bc3f0 3.6 MB/s |  96 kB     00:00
Dec  5 07:10:46 np0005546954 dnf[30913]: CentOS Stream 9 - BaseOS                         39 kB/s | 7.3 kB     00:00
Dec  5 07:10:47 np0005546954 dnf[30913]: CentOS Stream 9 - AppStream                      33 kB/s | 7.4 kB     00:00
Dec  5 07:10:47 np0005546954 dnf[30913]: CentOS Stream 9 - CRB                            81 kB/s | 7.2 kB     00:00
Dec  5 07:10:47 np0005546954 dnf[30913]: CentOS Stream 9 - Extras packages                77 kB/s | 8.3 kB     00:00
Dec  5 07:10:47 np0005546954 dnf[30913]: dlrn-antelope-testing                            11 MB/s | 1.1 MB     00:00
Dec  5 07:10:48 np0005546954 dnf[30913]: dlrn-antelope-build-deps                        4.1 MB/s | 461 kB     00:00
Dec  5 07:10:48 np0005546954 dnf[30913]: centos9-rabbitmq                                622 kB/s | 123 kB     00:00
Dec  5 07:10:48 np0005546954 dnf[30913]: centos9-storage                                 1.7 MB/s | 415 kB     00:00
Dec  5 07:10:49 np0005546954 dnf[30913]: centos9-opstools                                406 kB/s |  51 kB     00:00
Dec  5 07:10:49 np0005546954 dnf[30913]: NFV SIG OpenvSwitch                             4.0 MB/s | 456 kB     00:00
Dec  5 07:10:50 np0005546954 dnf[30913]: repo-setup-centos-appstream                      61 MB/s |  25 MB     00:00
Dec  5 07:10:57 np0005546954 dnf[30913]: repo-setup-centos-baseos                         22 MB/s | 8.8 MB     00:00
Dec  5 07:10:59 np0005546954 dnf[30913]: repo-setup-centos-highavailability              9.8 MB/s | 744 kB     00:00
Dec  5 07:10:59 np0005546954 dnf[30913]: repo-setup-centos-powertools                     48 MB/s | 7.3 MB     00:00
Dec  5 07:11:03 np0005546954 dnf[30913]: Extra Packages for Enterprise Linux 9 - x86_64   11 MB/s |  20 MB     00:01
Dec  5 07:11:25 np0005546954 dnf[30913]: Metadata cache created.
Dec  5 07:11:25 np0005546954 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec  5 07:11:25 np0005546954 systemd[1]: Finished dnf makecache.
Dec  5 07:11:25 np0005546954 systemd[1]: dnf-makecache.service: Consumed 35.466s CPU time.
Dec  5 07:14:54 np0005546954 systemd-logind[789]: New session 9 of user zuul.
Dec  5 07:14:54 np0005546954 systemd[1]: Started Session 9 of User zuul.
Dec  5 07:14:55 np0005546954 python3.9[31172]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:14:57 np0005546954 python3.9[31353]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:15:05 np0005546954 systemd[1]: session-9.scope: Deactivated successfully.
Dec  5 07:15:05 np0005546954 systemd[1]: session-9.scope: Consumed 9.171s CPU time.
Dec  5 07:15:05 np0005546954 systemd-logind[789]: Session 9 logged out. Waiting for processes to exit.
Dec  5 07:15:05 np0005546954 systemd-logind[789]: Removed session 9.
Dec  5 07:15:11 np0005546954 systemd-logind[789]: New session 10 of user zuul.
Dec  5 07:15:11 np0005546954 systemd[1]: Started Session 10 of User zuul.
Dec  5 07:15:12 np0005546954 python3.9[31564]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:15:12 np0005546954 systemd[1]: session-10.scope: Deactivated successfully.
Dec  5 07:15:12 np0005546954 systemd-logind[789]: Session 10 logged out. Waiting for processes to exit.
Dec  5 07:15:12 np0005546954 systemd-logind[789]: Removed session 10.
Dec  5 07:15:28 np0005546954 systemd-logind[789]: New session 11 of user zuul.
Dec  5 07:15:28 np0005546954 systemd[1]: Started Session 11 of User zuul.
Dec  5 07:15:29 np0005546954 python3.9[31744]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  5 07:15:30 np0005546954 python3.9[31918]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:15:31 np0005546954 python3.9[32070]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:15:32 np0005546954 python3.9[32223]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:15:33 np0005546954 python3.9[32375]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:15:34 np0005546954 python3.9[32527]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:15:35 np0005546954 python3.9[32650]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764936933.6472483-126-163698772417572/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:15:35 np0005546954 python3.9[32802]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:15:36 np0005546954 python3.9[32958]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:15:37 np0005546954 python3.9[33110]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:15:38 np0005546954 python3.9[33260]: ansible-ansible.builtin.service_facts Invoked
Dec  5 07:15:42 np0005546954 python3.9[33513]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:15:43 np0005546954 python3.9[33663]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:15:44 np0005546954 python3.9[33817]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:15:45 np0005546954 python3.9[33975]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:15:46 np0005546954 python3.9[34059]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:16:32 np0005546954 systemd[1]: Reloading.
Dec  5 07:16:32 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:16:33 np0005546954 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec  5 07:16:33 np0005546954 systemd[1]: Reloading.
Dec  5 07:16:33 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:16:33 np0005546954 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec  5 07:16:33 np0005546954 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec  5 07:16:33 np0005546954 systemd[1]: Reloading.
Dec  5 07:16:33 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:16:34 np0005546954 systemd[1]: Listening on LVM2 poll daemon socket.
Dec  5 07:16:34 np0005546954 dbus-broker-launch[738]: Noticed file-system modification, trigger reload.
Dec  5 07:16:34 np0005546954 dbus-broker-launch[738]: Noticed file-system modification, trigger reload.
Dec  5 07:16:34 np0005546954 dbus-broker-launch[738]: Noticed file-system modification, trigger reload.
Dec  5 07:17:46 np0005546954 kernel: SELinux:  Converting 2719 SID table entries...
Dec  5 07:17:46 np0005546954 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 07:17:46 np0005546954 kernel: SELinux:  policy capability open_perms=1
Dec  5 07:17:46 np0005546954 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 07:17:46 np0005546954 kernel: SELinux:  policy capability always_check_network=0
Dec  5 07:17:46 np0005546954 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 07:17:46 np0005546954 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 07:17:46 np0005546954 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 07:17:46 np0005546954 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec  5 07:17:46 np0005546954 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 07:17:46 np0005546954 systemd[1]: Starting man-db-cache-update.service...
Dec  5 07:17:46 np0005546954 systemd[1]: Reloading.
Dec  5 07:17:46 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:17:46 np0005546954 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 07:17:48 np0005546954 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 07:17:48 np0005546954 systemd[1]: Finished man-db-cache-update.service.
Dec  5 07:17:48 np0005546954 systemd[1]: man-db-cache-update.service: Consumed 1.369s CPU time.
Dec  5 07:17:48 np0005546954 systemd[1]: run-rf38bf277d1254d1b93e400120ee53724.service: Deactivated successfully.
Dec  5 07:17:49 np0005546954 python3.9[35594]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:17:51 np0005546954 python3.9[35875]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  5 07:17:52 np0005546954 python3.9[36027]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  5 07:17:56 np0005546954 python3.9[36180]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:17:57 np0005546954 python3.9[36332]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec  5 07:18:01 np0005546954 python3.9[36484]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:18:07 np0005546954 python3.9[36636]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:18:08 np0005546954 python3.9[36759]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937081.8105955-452-11926159327759/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=59ae341700d0fc55e2ef1fdd1f1ac8c51deabc0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:18:09 np0005546954 python3.9[36911]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:18:09 np0005546954 python3.9[37063]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:18:10 np0005546954 python3.9[37216]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:18:11 np0005546954 python3.9[37368]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  5 07:18:11 np0005546954 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:18:11 np0005546954 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:18:12 np0005546954 python3.9[37522]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  5 07:18:13 np0005546954 python3.9[37680]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  5 07:18:14 np0005546954 python3.9[37840]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  5 07:18:15 np0005546954 python3.9[37993]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  5 07:18:16 np0005546954 python3.9[38151]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  5 07:18:17 np0005546954 python3.9[38303]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:18:21 np0005546954 python3.9[38456]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:18:22 np0005546954 python3.9[38608]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:18:23 np0005546954 python3.9[38731]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937102.2106588-690-184019124919493/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:18:24 np0005546954 python3.9[38883]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:18:24 np0005546954 systemd[1]: Starting Load Kernel Modules...
Dec  5 07:18:24 np0005546954 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec  5 07:18:24 np0005546954 kernel: Bridge firewalling registered
Dec  5 07:18:24 np0005546954 systemd-modules-load[38887]: Inserted module 'br_netfilter'
Dec  5 07:18:24 np0005546954 systemd[1]: Finished Load Kernel Modules.
Dec  5 07:18:25 np0005546954 python3.9[39043]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:18:25 np0005546954 python3.9[39166]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937104.6958292-736-140051156471032/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:18:26 np0005546954 python3.9[39318]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:18:30 np0005546954 dbus-broker-launch[738]: Noticed file-system modification, trigger reload.
Dec  5 07:18:30 np0005546954 dbus-broker-launch[738]: Noticed file-system modification, trigger reload.
Dec  5 07:18:30 np0005546954 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 07:18:30 np0005546954 systemd[1]: Starting man-db-cache-update.service...
Dec  5 07:18:30 np0005546954 systemd[1]: Reloading.
Dec  5 07:18:30 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:18:30 np0005546954 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 07:18:32 np0005546954 python3.9[40774]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:18:33 np0005546954 python3.9[41962]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  5 07:18:33 np0005546954 python3.9[42755]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:18:34 np0005546954 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 07:18:34 np0005546954 systemd[1]: Finished man-db-cache-update.service.
Dec  5 07:18:34 np0005546954 systemd[1]: man-db-cache-update.service: Consumed 4.917s CPU time.
Dec  5 07:18:34 np0005546954 systemd[1]: run-rc7003aecc0974de9b62d81b4aba0964a.service: Deactivated successfully.
Dec  5 07:18:34 np0005546954 python3.9[43514]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:18:34 np0005546954 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  5 07:18:35 np0005546954 systemd[1]: Starting Authorization Manager...
Dec  5 07:18:35 np0005546954 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  5 07:18:35 np0005546954 polkitd[43732]: Started polkitd version 0.117
Dec  5 07:18:35 np0005546954 systemd[1]: Started Authorization Manager.
Dec  5 07:18:36 np0005546954 python3.9[43902]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:18:36 np0005546954 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  5 07:18:36 np0005546954 systemd[1]: tuned.service: Deactivated successfully.
Dec  5 07:18:36 np0005546954 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  5 07:18:36 np0005546954 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  5 07:18:36 np0005546954 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  5 07:18:37 np0005546954 python3.9[44063]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  5 07:18:39 np0005546954 python3.9[44215]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:18:39 np0005546954 systemd[1]: Reloading.
Dec  5 07:18:39 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:18:40 np0005546954 python3.9[44404]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:18:40 np0005546954 systemd[1]: Reloading.
Dec  5 07:18:40 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:18:41 np0005546954 python3.9[44593]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:18:42 np0005546954 python3.9[44746]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:18:42 np0005546954 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec  5 07:18:43 np0005546954 python3.9[44899]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:18:45 np0005546954 python3.9[45061]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:18:46 np0005546954 python3.9[45214]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:18:46 np0005546954 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  5 07:18:46 np0005546954 systemd[1]: Stopped Apply Kernel Variables.
Dec  5 07:18:46 np0005546954 systemd[1]: Stopping Apply Kernel Variables...
Dec  5 07:18:46 np0005546954 systemd[1]: Starting Apply Kernel Variables...
Dec  5 07:18:46 np0005546954 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  5 07:18:46 np0005546954 systemd[1]: Finished Apply Kernel Variables.
Dec  5 07:18:46 np0005546954 systemd-logind[789]: Session 11 logged out. Waiting for processes to exit.
Dec  5 07:18:46 np0005546954 systemd[1]: session-11.scope: Deactivated successfully.
Dec  5 07:18:46 np0005546954 systemd[1]: session-11.scope: Consumed 2min 23.812s CPU time.
Dec  5 07:18:46 np0005546954 systemd-logind[789]: Removed session 11.
Dec  5 07:18:52 np0005546954 systemd-logind[789]: New session 12 of user zuul.
Dec  5 07:18:52 np0005546954 systemd[1]: Started Session 12 of User zuul.
Dec  5 07:18:53 np0005546954 python3.9[45398]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:18:54 np0005546954 python3.9[45552]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:18:55 np0005546954 python3.9[45708]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:18:56 np0005546954 python3.9[45859]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:18:57 np0005546954 python3.9[46015]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:18:58 np0005546954 python3.9[46099]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:19:00 np0005546954 python3.9[46252]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:19:01 np0005546954 python3.9[46423]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:19:02 np0005546954 python3.9[46575]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:19:02 np0005546954 podman[46576]: 2025-12-05 12:19:02.439886158 +0000 UTC m=+0.057512664 system refresh
Dec  5 07:19:03 np0005546954 python3.9[46739]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:19:03 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:19:03 np0005546954 python3.9[46862]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937142.6500921-199-262093562373539/.source.json follow=False _original_basename=podman_network_config.j2 checksum=8de1c948572dbb85b9d5f998430f3c2f8e7b0b95 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:19:04 np0005546954 python3.9[47014]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:19:05 np0005546954 python3.9[47137]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937144.07047-229-233440724800320/.source.conf follow=False _original_basename=registries.conf.j2 checksum=a4d0af73e82956a82115da1152ffa584e292554a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:19:05 np0005546954 python3.9[47289]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:19:06 np0005546954 python3.9[47441]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:19:07 np0005546954 python3.9[47593]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:19:07 np0005546954 python3.9[47745]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:19:08 np0005546954 python3.9[47895]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:19:09 np0005546954 python3.9[48049]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 07:19:11 np0005546954 python3.9[48202]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 07:19:14 np0005546954 python3.9[48362]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 07:19:16 np0005546954 python3.9[48515]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 07:19:19 np0005546954 python3.9[48668]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 07:19:21 np0005546954 python3.9[48824]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 07:19:25 np0005546954 python3.9[48993]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 07:19:27 np0005546954 python3.9[49146]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 07:19:41 np0005546954 python3.9[49482]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 07:19:43 np0005546954 python3.9[49638]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:19:44 np0005546954 python3.9[49813]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:19:44 np0005546954 python3.9[49936]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764937183.9105854-525-214301845922200/.source.json _original_basename=.sljn7i7e follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:19:46 np0005546954 python3.9[50088]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  5 07:19:46 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:19:48 np0005546954 systemd[1]: var-lib-containers-storage-overlay-compat770190096-lower\x2dmapped.mount: Deactivated successfully.
Dec  5 07:19:51 np0005546954 podman[50101]: 2025-12-05 12:19:51.658619449 +0000 UTC m=+5.424714691 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  5 07:19:51 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:19:51 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:19:51 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:19:56 np0005546954 python3.9[50397]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  5 07:19:56 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:07 np0005546954 podman[50409]: 2025-12-05 12:20:07.543383144 +0000 UTC m=+11.306047105 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:20:07 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:07 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:07 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:11 np0005546954 python3.9[50706]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  5 07:20:11 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:12 np0005546954 podman[50719]: 2025-12-05 12:20:12.979406761 +0000 UTC m=+1.471583612 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  5 07:20:12 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:13 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:13 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:13 np0005546954 python3.9[50951]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  5 07:20:30 np0005546954 podman[50963]: 2025-12-05 12:20:30.091238841 +0000 UTC m=+16.139173712 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  5 07:20:30 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:30 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:30 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:35 np0005546954 python3.9[51229]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  5 07:20:35 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:39 np0005546954 podman[51241]: 2025-12-05 12:20:39.506860766 +0000 UTC m=+3.518796436 image pull 343ba269c9fe0a56d7572c8ca328dbce002017c4dd4986f43667971dd03085c2 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec  5 07:20:39 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:39 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:39 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:40 np0005546954 python3.9[51499]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  5 07:20:41 np0005546954 podman[51511]: 2025-12-05 12:20:41.645443175 +0000 UTC m=+1.287891443 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec  5 07:20:41 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:41 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:41 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:20:42 np0005546954 systemd[1]: session-12.scope: Deactivated successfully.
Dec  5 07:20:42 np0005546954 systemd[1]: session-12.scope: Consumed 1min 46.783s CPU time.
Dec  5 07:20:42 np0005546954 systemd-logind[789]: Session 12 logged out. Waiting for processes to exit.
Dec  5 07:20:42 np0005546954 systemd-logind[789]: Removed session 12.
Dec  5 07:20:47 np0005546954 systemd-logind[789]: New session 13 of user zuul.
Dec  5 07:20:47 np0005546954 systemd[1]: Started Session 13 of User zuul.
Dec  5 07:20:48 np0005546954 python3.9[51815]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:20:49 np0005546954 python3.9[51971]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  5 07:20:50 np0005546954 python3.9[52124]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  5 07:20:51 np0005546954 python3.9[52282]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  5 07:20:52 np0005546954 python3.9[52442]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:20:53 np0005546954 python3.9[52526]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 07:20:56 np0005546954 python3.9[52688]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:21:12 np0005546954 kernel: SELinux:  Converting 2732 SID table entries...
Dec  5 07:21:12 np0005546954 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 07:21:12 np0005546954 kernel: SELinux:  policy capability open_perms=1
Dec  5 07:21:12 np0005546954 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 07:21:12 np0005546954 kernel: SELinux:  policy capability always_check_network=0
Dec  5 07:21:12 np0005546954 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 07:21:12 np0005546954 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 07:21:12 np0005546954 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 07:21:12 np0005546954 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec  5 07:21:12 np0005546954 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec  5 07:21:13 np0005546954 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 07:21:13 np0005546954 systemd[1]: Starting man-db-cache-update.service...
Dec  5 07:21:13 np0005546954 systemd[1]: Reloading.
Dec  5 07:21:13 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:21:13 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:21:14 np0005546954 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 07:21:14 np0005546954 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 07:21:14 np0005546954 systemd[1]: Finished man-db-cache-update.service.
Dec  5 07:21:14 np0005546954 systemd[1]: run-rca3a6bd4eae3413c8eb354d44b1eb90a.service: Deactivated successfully.
Dec  5 07:21:15 np0005546954 python3.9[53787]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 07:21:15 np0005546954 systemd[1]: Reloading.
Dec  5 07:21:15 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:21:15 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:21:16 np0005546954 systemd[1]: Starting Open vSwitch Database Unit...
Dec  5 07:21:16 np0005546954 chown[53828]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec  5 07:21:16 np0005546954 ovs-ctl[53833]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec  5 07:21:16 np0005546954 ovs-ctl[53833]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec  5 07:21:16 np0005546954 ovs-ctl[53833]: Starting ovsdb-server [  OK  ]
Dec  5 07:21:16 np0005546954 ovs-vsctl[53882]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec  5 07:21:16 np0005546954 ovs-vsctl[53902]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"47f9f74c-08f9-451f-9678-93bb9e8fa2fe\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec  5 07:21:16 np0005546954 ovs-ctl[53833]: Configuring Open vSwitch system IDs [  OK  ]
Dec  5 07:21:16 np0005546954 ovs-vsctl[53908]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec  5 07:21:16 np0005546954 ovs-ctl[53833]: Enabling remote OVSDB managers [  OK  ]
Dec  5 07:21:16 np0005546954 systemd[1]: Started Open vSwitch Database Unit.
Dec  5 07:21:16 np0005546954 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec  5 07:21:16 np0005546954 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec  5 07:21:16 np0005546954 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec  5 07:21:16 np0005546954 kernel: openvswitch: Open vSwitch switching datapath
Dec  5 07:21:16 np0005546954 ovs-ctl[53952]: Inserting openvswitch module [  OK  ]
Dec  5 07:21:16 np0005546954 ovs-ctl[53921]: Starting ovs-vswitchd [  OK  ]
Dec  5 07:21:16 np0005546954 ovs-ctl[53921]: Enabling remote OVSDB managers [  OK  ]
Dec  5 07:21:16 np0005546954 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec  5 07:21:16 np0005546954 ovs-vsctl[53969]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec  5 07:21:16 np0005546954 systemd[1]: Starting Open vSwitch...
Dec  5 07:21:16 np0005546954 systemd[1]: Finished Open vSwitch.
Dec  5 07:21:17 np0005546954 python3.9[54121]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:21:18 np0005546954 python3.9[54273]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec  5 07:21:19 np0005546954 kernel: SELinux:  Converting 2746 SID table entries...
Dec  5 07:21:19 np0005546954 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 07:21:19 np0005546954 kernel: SELinux:  policy capability open_perms=1
Dec  5 07:21:19 np0005546954 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 07:21:19 np0005546954 kernel: SELinux:  policy capability always_check_network=0
Dec  5 07:21:19 np0005546954 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 07:21:19 np0005546954 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 07:21:19 np0005546954 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 07:21:20 np0005546954 python3.9[54428]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:21:21 np0005546954 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec  5 07:21:21 np0005546954 python3.9[54586]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:21:23 np0005546954 python3.9[54739]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:21:25 np0005546954 python3.9[55026]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  5 07:21:26 np0005546954 python3.9[55176]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:21:27 np0005546954 python3.9[55330]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:21:29 np0005546954 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 07:21:29 np0005546954 systemd[1]: Starting man-db-cache-update.service...
Dec  5 07:21:29 np0005546954 systemd[1]: Reloading.
Dec  5 07:21:29 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:21:29 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:21:29 np0005546954 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 07:21:30 np0005546954 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 07:21:30 np0005546954 systemd[1]: Finished man-db-cache-update.service.
Dec  5 07:21:30 np0005546954 systemd[1]: run-r7687d9cad58445cba28b3f8f9046b811.service: Deactivated successfully.
Dec  5 07:21:30 np0005546954 python3.9[55647]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:21:31 np0005546954 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  5 07:21:31 np0005546954 systemd[1]: Stopped Network Manager Wait Online.
Dec  5 07:21:31 np0005546954 systemd[1]: Stopping Network Manager Wait Online...
Dec  5 07:21:31 np0005546954 systemd[1]: Stopping Network Manager...
Dec  5 07:21:31 np0005546954 NetworkManager[7188]: <info>  [1764937291.0014] caught SIGTERM, shutting down normally.
Dec  5 07:21:31 np0005546954 NetworkManager[7188]: <info>  [1764937291.0031] dhcp4 (eth0): canceled DHCP transaction
Dec  5 07:21:31 np0005546954 NetworkManager[7188]: <info>  [1764937291.0032] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  5 07:21:31 np0005546954 NetworkManager[7188]: <info>  [1764937291.0032] dhcp4 (eth0): state changed no lease
Dec  5 07:21:31 np0005546954 NetworkManager[7188]: <info>  [1764937291.0036] manager: NetworkManager state is now CONNECTED_SITE
Dec  5 07:21:31 np0005546954 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  5 07:21:31 np0005546954 NetworkManager[7188]: <info>  [1764937291.0200] exiting (success)
Dec  5 07:21:31 np0005546954 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  5 07:21:31 np0005546954 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  5 07:21:31 np0005546954 systemd[1]: Stopped Network Manager.
Dec  5 07:21:31 np0005546954 systemd[1]: NetworkManager.service: Consumed 16.356s CPU time, 4.1M memory peak, read 0B from disk, written 17.5K to disk.
Dec  5 07:21:31 np0005546954 systemd[1]: Starting Network Manager...
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.0945] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:46eaa6a6-b96c-4b3b-a171-c5d47450a30e)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.0946] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1010] manager[0x5623a17c1090]: monitoring kernel firmware directory '/lib/firmware'.
Dec  5 07:21:31 np0005546954 systemd[1]: Starting Hostname Service...
Dec  5 07:21:31 np0005546954 systemd[1]: Started Hostname Service.
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1764] hostname: hostname: using hostnamed
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1765] hostname: static hostname changed from (none) to "compute-1"
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1777] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1786] manager[0x5623a17c1090]: rfkill: Wi-Fi hardware radio set enabled
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1787] manager[0x5623a17c1090]: rfkill: WWAN hardware radio set enabled
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1823] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1837] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1838] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1839] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1840] manager: Networking is enabled by state file
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1843] settings: Loaded settings plugin: keyfile (internal)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1848] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1887] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1907] dhcp: init: Using DHCP client 'internal'
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1910] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1919] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1926] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1940] device (lo): Activation: starting connection 'lo' (5db90527-565f-43c8-b47d-5e445792da38)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1949] device (eth0): carrier: link connected
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1954] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1960] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1961] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1969] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1976] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1984] device (eth1): carrier: link connected
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1990] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1999] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (561dd4dc-d770-5231-ac01-02964b3b80f6) (indicated)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.1999] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2007] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2017] device (eth1): Activation: starting connection 'ci-private-network' (561dd4dc-d770-5231-ac01-02964b3b80f6)
Dec  5 07:21:31 np0005546954 systemd[1]: Started Network Manager.
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2027] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2048] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2051] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2066] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2070] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2076] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2080] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2086] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2093] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2104] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2109] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2122] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2143] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 systemd[1]: Starting Network Manager Wait Online...
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2154] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2159] dhcp4 (eth0): state changed new lease, address=38.102.83.243
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2165] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2174] device (lo): Activation: successful, device activated.
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2188] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2268] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2276] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2287] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2293] manager: NetworkManager state is now CONNECTED_LOCAL
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2299] device (eth1): Activation: successful, device activated.
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2331] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2335] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2342] manager: NetworkManager state is now CONNECTED_SITE
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2348] device (eth0): Activation: successful, device activated.
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2357] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  5 07:21:31 np0005546954 NetworkManager[55665]: <info>  [1764937291.2389] manager: startup complete
Dec  5 07:21:31 np0005546954 systemd[1]: Finished Network Manager Wait Online.
Dec  5 07:21:31 np0005546954 python3.9[55873]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:21:36 np0005546954 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 07:21:36 np0005546954 systemd[1]: Starting man-db-cache-update.service...
Dec  5 07:21:36 np0005546954 systemd[1]: Reloading.
Dec  5 07:21:36 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:21:36 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:21:36 np0005546954 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 07:21:38 np0005546954 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 07:21:38 np0005546954 systemd[1]: Finished man-db-cache-update.service.
Dec  5 07:21:38 np0005546954 systemd[1]: run-rb52a0b13af8b4523a7b394b36937bcc3.service: Deactivated successfully.
Dec  5 07:21:38 np0005546954 python3.9[56331]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:21:39 np0005546954 python3.9[56483]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:21:40 np0005546954 python3.9[56637]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:21:41 np0005546954 python3.9[56789]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:21:41 np0005546954 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  5 07:21:41 np0005546954 python3.9[56941]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:21:42 np0005546954 python3.9[57093]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:21:43 np0005546954 python3.9[57245]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:21:44 np0005546954 python3.9[57368]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937302.7437432-439-276895974477360/.source _original_basename=._9eltgmy follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:21:44 np0005546954 python3.9[57520]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:21:45 np0005546954 python3.9[57672]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec  5 07:21:46 np0005546954 python3.9[57824]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:21:48 np0005546954 python3.9[58251]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec  5 07:21:49 np0005546954 ansible-async_wrapper.py[58426]: Invoked with j226898819187 300 /home/zuul/.ansible/tmp/ansible-tmp-1764937308.945177-571-130071415821642/AnsiballZ_edpm_os_net_config.py _
Dec  5 07:21:49 np0005546954 ansible-async_wrapper.py[58429]: Starting module and watcher
Dec  5 07:21:49 np0005546954 ansible-async_wrapper.py[58429]: Start watching 58430 (300)
Dec  5 07:21:49 np0005546954 ansible-async_wrapper.py[58430]: Start module (58430)
Dec  5 07:21:49 np0005546954 ansible-async_wrapper.py[58426]: Return async_wrapper task started.
Dec  5 07:21:49 np0005546954 python3.9[58431]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec  5 07:21:50 np0005546954 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec  5 07:21:50 np0005546954 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec  5 07:21:50 np0005546954 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec  5 07:21:50 np0005546954 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec  5 07:21:50 np0005546954 kernel: cfg80211: failed to load regulatory.db
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.7770] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.7790] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8353] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8355] audit: op="connection-add" uuid="30e9e5c5-b5f7-43b8-9e1e-c0f854660dae" name="br-ex-br" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8371] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8372] audit: op="connection-add" uuid="3e8f8a3a-d777-4ea4-845b-e113eef5ae6a" name="br-ex-port" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8384] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8385] audit: op="connection-add" uuid="ade6ab79-2099-4530-92c7-ada92d3021a7" name="eth1-port" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8397] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8398] audit: op="connection-add" uuid="d320b7c4-e4a0-4c6e-b43a-a1b5f45ee263" name="vlan20-port" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8412] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8413] audit: op="connection-add" uuid="399a9ac3-9ad1-482a-83fd-68f370fc47a4" name="vlan21-port" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8423] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8424] audit: op="connection-add" uuid="74e5f3a6-1407-4517-9cbb-94445fd4fd91" name="vlan22-port" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8442] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,connection.autoconnect-priority,connection.timestamp" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8458] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8459] audit: op="connection-add" uuid="802cd73d-c44c-4ed0-8b47-dbe647a8cf7b" name="br-ex-if" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8537] audit: op="connection-update" uuid="561dd4dc-d770-5231-ac01-02964b3b80f6" name="ci-private-network" args="ipv4.never-default,ipv4.routing-rules,ipv4.dns,ipv4.method,ipv4.routes,ipv4.addresses,ovs-external-ids.data,ipv6.addr-gen-mode,ipv6.routes,ipv6.routing-rules,ipv6.dns,ipv6.method,ipv6.addresses,connection.controller,connection.timestamp,connection.slave-type,connection.port-type,connection.master,ovs-interface.type" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8557] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8558] audit: op="connection-add" uuid="601ba585-b889-4b60-906f-d9bb99610f38" name="vlan20-if" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8577] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8580] audit: op="connection-add" uuid="3de8a3bb-45de-4157-a43e-bbc2e8969735" name="vlan21-if" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8596] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8597] audit: op="connection-add" uuid="75c0c11b-606f-456c-a850-6780fc070b8b" name="vlan22-if" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8613] audit: op="connection-delete" uuid="389ad192-0e8f-3736-8c5b-2cd019e054e1" name="Wired connection 1" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8644] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8654] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8657] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (30e9e5c5-b5f7-43b8-9e1e-c0f854660dae)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8658] audit: op="connection-activate" uuid="30e9e5c5-b5f7-43b8-9e1e-c0f854660dae" name="br-ex-br" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8659] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8665] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8669] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (3e8f8a3a-d777-4ea4-845b-e113eef5ae6a)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8670] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8674] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8677] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (ade6ab79-2099-4530-92c7-ada92d3021a7)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8679] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8684] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8687] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (d320b7c4-e4a0-4c6e-b43a-a1b5f45ee263)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8688] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8693] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8695] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (399a9ac3-9ad1-482a-83fd-68f370fc47a4)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8696] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8700] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8703] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (74e5f3a6-1407-4517-9cbb-94445fd4fd91)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8704] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8705] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8706] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8711] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8714] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8717] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (802cd73d-c44c-4ed0-8b47-dbe647a8cf7b)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8717] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8719] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8720] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8721] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8722] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8729] device (eth1): disconnecting for new activation request.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8729] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8731] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8732] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8733] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8735] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8738] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8740] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (601ba585-b889-4b60-906f-d9bb99610f38)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8741] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8743] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8745] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8745] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8747] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8750] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8753] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (3de8a3bb-45de-4157-a43e-bbc2e8969735)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8753] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8755] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8756] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8757] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8758] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8762] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8765] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (75c0c11b-606f-456c-a850-6780fc070b8b)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8765] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8767] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8768] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8769] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8769] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8778] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.method,connection.autoconnect-priority" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8779] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8782] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8783] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8790] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8793] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8796] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 kernel: ovs-system: entered promiscuous mode
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8814] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8817] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8825] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8830] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8834] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8835] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 kernel: Timeout policy base is empty
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8840] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8844] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8847] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 systemd-udevd[58436]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8849] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8858] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8863] dhcp4 (eth0): canceled DHCP transaction
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8863] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8863] dhcp4 (eth0): state changed no lease
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8865] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec  5 07:21:51 np0005546954 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8890] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8893] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58432 uid=0 result="fail" reason="Device is not activated"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8899] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8935] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8939] dhcp4 (eth0): state changed new lease, address=38.102.83.243
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.8943] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9017] device (eth1): disconnecting for new activation request.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9018] audit: op="connection-activate" uuid="561dd4dc-d770-5231-ac01-02964b3b80f6" name="ci-private-network" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9042] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58432 uid=0 result="success"
Dec  5 07:21:51 np0005546954 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9106] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec  5 07:21:51 np0005546954 kernel: br-ex: entered promiscuous mode
Dec  5 07:21:51 np0005546954 kernel: vlan22: entered promiscuous mode
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9279] device (eth1): Activation: starting connection 'ci-private-network' (561dd4dc-d770-5231-ac01-02964b3b80f6)
Dec  5 07:21:51 np0005546954 systemd-udevd[58437]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9296] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9302] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9320] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9323] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9339] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9340] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9342] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9343] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 kernel: vlan21: entered promiscuous mode
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9362] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9386] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9391] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9395] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9399] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec  5 07:21:51 np0005546954 kernel: vlan20: entered promiscuous mode
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9402] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9405] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9408] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9417] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9420] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9424] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9427] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9431] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9438] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec  5 07:21:51 np0005546954 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9475] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9482] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9495] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9498] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9507] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9518] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9524] device (eth1): Activation: successful, device activated.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9542] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9561] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9565] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9570] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9579] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9585] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9598] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9609] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9611] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9614] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9621] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9626] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9632] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9637] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9707] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9709] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 07:21:51 np0005546954 NetworkManager[55665]: <info>  [1764937311.9714] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  5 07:21:53 np0005546954 NetworkManager[55665]: <info>  [1764937313.1025] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58432 uid=0 result="success"
Dec  5 07:21:53 np0005546954 NetworkManager[55665]: <info>  [1764937313.2737] checkpoint[0x5623a1796950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec  5 07:21:53 np0005546954 NetworkManager[55665]: <info>  [1764937313.2739] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58432 uid=0 result="success"
Dec  5 07:21:53 np0005546954 NetworkManager[55665]: <info>  [1764937313.6009] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58432 uid=0 result="success"
Dec  5 07:21:53 np0005546954 NetworkManager[55665]: <info>  [1764937313.6024] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58432 uid=0 result="success"
Dec  5 07:21:53 np0005546954 python3.9[58765]: ansible-ansible.legacy.async_status Invoked with jid=j226898819187.58426 mode=status _async_dir=/root/.ansible_async
Dec  5 07:21:53 np0005546954 NetworkManager[55665]: <info>  [1764937313.8078] audit: op="networking-control" arg="global-dns-configuration" pid=58432 uid=0 result="success"
Dec  5 07:21:53 np0005546954 NetworkManager[55665]: <info>  [1764937313.8109] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec  5 07:21:53 np0005546954 NetworkManager[55665]: <info>  [1764937313.8148] audit: op="networking-control" arg="global-dns-configuration" pid=58432 uid=0 result="success"
Dec  5 07:21:53 np0005546954 NetworkManager[55665]: <info>  [1764937313.8180] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58432 uid=0 result="success"
Dec  5 07:21:53 np0005546954 NetworkManager[55665]: <info>  [1764937313.9720] checkpoint[0x5623a1796a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec  5 07:21:53 np0005546954 NetworkManager[55665]: <info>  [1764937313.9728] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58432 uid=0 result="success"
Dec  5 07:21:54 np0005546954 ansible-async_wrapper.py[58430]: Module complete (58430)
Dec  5 07:21:54 np0005546954 ansible-async_wrapper.py[58429]: Done in kid B.
Dec  5 07:21:57 np0005546954 python3.9[58870]: ansible-ansible.legacy.async_status Invoked with jid=j226898819187.58426 mode=status _async_dir=/root/.ansible_async
Dec  5 07:21:57 np0005546954 python3.9[58970]: ansible-ansible.legacy.async_status Invoked with jid=j226898819187.58426 mode=cleanup _async_dir=/root/.ansible_async
Dec  5 07:21:58 np0005546954 python3.9[59122]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:21:58 np0005546954 python3.9[59245]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937317.8114064-625-137067066023625/.source.returncode _original_basename=._u8g8r1e follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:21:59 np0005546954 python3.9[59397]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:22:00 np0005546954 python3.9[59520]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937319.110162-657-212090139307689/.source.cfg _original_basename=.yt0locqn follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:22:01 np0005546954 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  5 07:22:01 np0005546954 python3.9[59673]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:22:01 np0005546954 systemd[1]: Reloading Network Manager...
Dec  5 07:22:01 np0005546954 NetworkManager[55665]: <info>  [1764937321.2992] audit: op="reload" arg="0" pid=59679 uid=0 result="success"
Dec  5 07:22:01 np0005546954 NetworkManager[55665]: <info>  [1764937321.3003] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec  5 07:22:01 np0005546954 systemd[1]: Reloaded Network Manager.
Dec  5 07:22:01 np0005546954 systemd[1]: session-13.scope: Deactivated successfully.
Dec  5 07:22:01 np0005546954 systemd[1]: session-13.scope: Consumed 50.514s CPU time.
Dec  5 07:22:01 np0005546954 systemd-logind[789]: Session 13 logged out. Waiting for processes to exit.
Dec  5 07:22:01 np0005546954 systemd-logind[789]: Removed session 13.
Dec  5 07:22:07 np0005546954 systemd-logind[789]: New session 14 of user zuul.
Dec  5 07:22:07 np0005546954 systemd[1]: Started Session 14 of User zuul.
Dec  5 07:22:08 np0005546954 python3.9[59864]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:22:09 np0005546954 python3.9[60018]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:22:11 np0005546954 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  5 07:22:11 np0005546954 python3.9[60208]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:22:11 np0005546954 systemd[1]: session-14.scope: Deactivated successfully.
Dec  5 07:22:11 np0005546954 systemd[1]: session-14.scope: Consumed 2.212s CPU time.
Dec  5 07:22:11 np0005546954 systemd-logind[789]: Session 14 logged out. Waiting for processes to exit.
Dec  5 07:22:11 np0005546954 systemd-logind[789]: Removed session 14.
Dec  5 07:22:17 np0005546954 systemd-logind[789]: New session 15 of user zuul.
Dec  5 07:22:17 np0005546954 systemd[1]: Started Session 15 of User zuul.
Dec  5 07:22:18 np0005546954 python3.9[60390]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:22:18 np0005546954 python3.9[60544]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:22:19 np0005546954 python3.9[60701]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:22:20 np0005546954 python3.9[60785]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:22:22 np0005546954 python3.9[60938]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:22:24 np0005546954 python3.9[61130]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:22:24 np0005546954 python3.9[61282]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:22:24 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:22:25 np0005546954 python3.9[61446]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:22:26 np0005546954 python3.9[61524]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:22:26 np0005546954 python3.9[61676]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:22:27 np0005546954 python3.9[61754]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:22:28 np0005546954 python3.9[61906]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:22:28 np0005546954 python3.9[62058]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:22:29 np0005546954 python3.9[62210]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:22:29 np0005546954 python3.9[62362]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:22:30 np0005546954 python3.9[62514]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:22:33 np0005546954 python3.9[62667]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:22:34 np0005546954 python3.9[62821]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:22:35 np0005546954 python3.9[62973]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:22:35 np0005546954 python3.9[63125]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:22:36 np0005546954 python3.9[63278]: ansible-service_facts Invoked
Dec  5 07:22:36 np0005546954 network[63295]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 07:22:36 np0005546954 network[63296]: 'network-scripts' will be removed from distribution in near future.
Dec  5 07:22:36 np0005546954 network[63297]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 07:22:42 np0005546954 python3.9[63749]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:22:45 np0005546954 python3.9[63902]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  5 07:22:46 np0005546954 python3.9[64054]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:22:47 np0005546954 python3.9[64179]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937365.9402568-445-154862035821354/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:22:47 np0005546954 python3.9[64333]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:22:48 np0005546954 python3.9[64458]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937367.457847-475-136523280232998/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:22:50 np0005546954 python3.9[64612]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:22:51 np0005546954 python3.9[64766]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:22:52 np0005546954 python3.9[64850]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:22:53 np0005546954 python3.9[65004]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:22:54 np0005546954 python3.9[65088]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:22:54 np0005546954 chronyd[791]: chronyd exiting
Dec  5 07:22:54 np0005546954 systemd[1]: Stopping NTP client/server...
Dec  5 07:22:54 np0005546954 systemd[1]: chronyd.service: Deactivated successfully.
Dec  5 07:22:54 np0005546954 systemd[1]: Stopped NTP client/server.
Dec  5 07:22:54 np0005546954 systemd[1]: Starting NTP client/server...
Dec  5 07:22:54 np0005546954 chronyd[65097]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  5 07:22:54 np0005546954 chronyd[65097]: Frequency -31.709 +/- 0.252 ppm read from /var/lib/chrony/drift
Dec  5 07:22:54 np0005546954 chronyd[65097]: Loaded seccomp filter (level 2)
Dec  5 07:22:54 np0005546954 systemd[1]: Started NTP client/server.
Dec  5 07:22:55 np0005546954 systemd[1]: session-15.scope: Deactivated successfully.
Dec  5 07:22:55 np0005546954 systemd[1]: session-15.scope: Consumed 25.818s CPU time.
Dec  5 07:22:55 np0005546954 systemd-logind[789]: Session 15 logged out. Waiting for processes to exit.
Dec  5 07:22:55 np0005546954 systemd-logind[789]: Removed session 15.
Dec  5 07:23:01 np0005546954 systemd-logind[789]: New session 16 of user zuul.
Dec  5 07:23:01 np0005546954 systemd[1]: Started Session 16 of User zuul.
Dec  5 07:23:02 np0005546954 python3.9[65276]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:23:03 np0005546954 python3.9[65432]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:04 np0005546954 python3.9[65607]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:04 np0005546954 python3.9[65685]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.a5pjs8va recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:05 np0005546954 python3.9[65837]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:06 np0005546954 python3.9[65960]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937385.0348108-103-202020960712307/.source _original_basename=.2xjkek4h follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:06 np0005546954 python3.9[66112]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:23:07 np0005546954 python3.9[66264]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:08 np0005546954 python3.9[66387]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937387.0293183-151-184599263772601/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:23:08 np0005546954 python3.9[66539]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:09 np0005546954 python3.9[66662]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937388.181063-151-134840346689654/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:23:09 np0005546954 python3.9[66814]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:10 np0005546954 python3.9[66966]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:11 np0005546954 python3.9[67089]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937390.1582108-225-100723579401288/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:11 np0005546954 python3.9[67241]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:12 np0005546954 python3.9[67364]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937391.2733965-255-78114648598404/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:13 np0005546954 python3.9[67516]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:23:13 np0005546954 systemd[1]: Reloading.
Dec  5 07:23:13 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:23:13 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:23:13 np0005546954 systemd[1]: Reloading.
Dec  5 07:23:13 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:23:13 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:23:14 np0005546954 systemd[1]: Starting EDPM Container Shutdown...
Dec  5 07:23:14 np0005546954 systemd[1]: Finished EDPM Container Shutdown.
Dec  5 07:23:14 np0005546954 python3.9[67742]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:15 np0005546954 python3.9[67865]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937394.232528-301-258648883662433/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:15 np0005546954 python3.9[68017]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:16 np0005546954 python3.9[68140]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937395.3896034-331-95945661882429/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:17 np0005546954 python3.9[68292]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:23:17 np0005546954 systemd[1]: Reloading.
Dec  5 07:23:17 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:23:17 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:23:17 np0005546954 systemd[1]: Reloading.
Dec  5 07:23:17 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:23:17 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:23:17 np0005546954 systemd[1]: Starting Create netns directory...
Dec  5 07:23:17 np0005546954 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  5 07:23:17 np0005546954 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  5 07:23:17 np0005546954 systemd[1]: Finished Create netns directory.
Dec  5 07:23:18 np0005546954 python3.9[68519]: ansible-ansible.builtin.service_facts Invoked
Dec  5 07:23:18 np0005546954 network[68536]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 07:23:18 np0005546954 network[68537]: 'network-scripts' will be removed from distribution in near future.
Dec  5 07:23:18 np0005546954 network[68538]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 07:23:21 np0005546954 python3.9[68800]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:23:21 np0005546954 systemd[1]: Reloading.
Dec  5 07:23:21 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:23:21 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:23:22 np0005546954 systemd[1]: Stopping IPv4 firewall with iptables...
Dec  5 07:23:22 np0005546954 iptables.init[68841]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec  5 07:23:22 np0005546954 iptables.init[68841]: iptables: Flushing firewall rules: [  OK  ]
Dec  5 07:23:22 np0005546954 systemd[1]: iptables.service: Deactivated successfully.
Dec  5 07:23:22 np0005546954 systemd[1]: Stopped IPv4 firewall with iptables.
Dec  5 07:23:23 np0005546954 python3.9[69037]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:23:23 np0005546954 python3.9[69191]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:23:24 np0005546954 systemd[1]: Reloading.
Dec  5 07:23:24 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:23:24 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:23:24 np0005546954 systemd[1]: Starting Netfilter Tables...
Dec  5 07:23:24 np0005546954 systemd[1]: Finished Netfilter Tables.
Dec  5 07:23:25 np0005546954 python3.9[69383]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:23:26 np0005546954 python3.9[69536]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:26 np0005546954 python3.9[69661]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937405.6284769-469-4999395797539/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:27 np0005546954 python3.9[69814]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:23:27 np0005546954 systemd[1]: Reloading OpenSSH server daemon...
Dec  5 07:23:27 np0005546954 systemd[1]: Reloaded OpenSSH server daemon.
Dec  5 07:23:28 np0005546954 python3.9[69970]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:29 np0005546954 python3.9[70122]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:29 np0005546954 python3.9[70245]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937408.547103-531-123706255875458/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:30 np0005546954 python3.9[70397]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  5 07:23:30 np0005546954 systemd[1]: Starting Time & Date Service...
Dec  5 07:23:30 np0005546954 systemd[1]: Started Time & Date Service.
Dec  5 07:23:31 np0005546954 python3.9[70553]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:32 np0005546954 python3.9[70705]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:32 np0005546954 python3.9[70828]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937411.9649358-601-173469137394952/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:33 np0005546954 python3.9[70980]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:34 np0005546954 python3.9[71103]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937413.1394868-631-173685444623490/.source.yaml _original_basename=.o4nthdau follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:34 np0005546954 python3.9[71255]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:35 np0005546954 python3.9[71378]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937414.3434362-661-82032948963242/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:36 np0005546954 python3.9[71530]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:23:36 np0005546954 python3.9[71683]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:23:37 np0005546954 python3[71836]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  5 07:23:38 np0005546954 python3.9[71988]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:38 np0005546954 python3.9[72111]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937417.7682054-739-187775826594922/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:39 np0005546954 python3.9[72263]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:40 np0005546954 python3.9[72386]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937419.0060802-769-210147521748489/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:40 np0005546954 python3.9[72538]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:41 np0005546954 python3.9[72661]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937420.2638419-799-257499731667131/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:41 np0005546954 python3.9[72813]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:42 np0005546954 python3.9[72936]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937421.4851131-829-16720876928576/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:43 np0005546954 python3.9[73090]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:23:43 np0005546954 python3.9[73213]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937422.858312-859-256951103694941/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:44 np0005546954 python3.9[73365]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:45 np0005546954 python3.9[73517]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:23:46 np0005546954 python3.9[73676]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:46 np0005546954 python3.9[73829]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:47 np0005546954 python3.9[73981]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:23:48 np0005546954 python3.9[74133]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  5 07:23:48 np0005546954 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:23:48 np0005546954 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:23:50 np0005546954 python3.9[74287]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  5 07:23:50 np0005546954 systemd[1]: session-16.scope: Deactivated successfully.
Dec  5 07:23:50 np0005546954 systemd[1]: session-16.scope: Consumed 34.013s CPU time.
Dec  5 07:23:50 np0005546954 systemd-logind[789]: Session 16 logged out. Waiting for processes to exit.
Dec  5 07:23:50 np0005546954 systemd-logind[789]: Removed session 16.
Dec  5 07:23:56 np0005546954 systemd-logind[789]: New session 17 of user zuul.
Dec  5 07:23:56 np0005546954 systemd[1]: Started Session 17 of User zuul.
Dec  5 07:23:56 np0005546954 python3.9[74468]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  5 07:23:57 np0005546954 python3.9[74620]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:23:58 np0005546954 python3.9[74772]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:23:59 np0005546954 python3.9[74924]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQClkuajhcgxBlWqexAAQ7vLl3HwckQAJS4mlXUD8htv//QYG3fQayS3GhUS6Byh++/drzAdIGZHvEHrsdEc87kJ5LFycv3APgXkrSmznWA0qr6/vxfViUGmVZTvBupLwGtVZkiWAXqiqMYvY7EFw8NJtxIbdzg74GWLkANMnAyjrqppEPRg8fUBMEBQoUWSodFRPGkl5tgbQ9iQsBh0PbO9EPpcA2gCBsoaYvWrAFH+uC7zC3udHCm9o1l46fx5uio41Ix455wSZKwnxEzn1CJAripOWWNbWC3rqg9nAxhsn95RNWQAJZc8kqTja/qimoU2k/ZnmasqfAhHU3UWqp7Qf/PfGsYsrqI+xS++9iT13xxSC/cqGQ3YvqPDhuh6HoRjDatoATaQjiu6zPZLZ46p8aNOLk3fsUP7MoriGIOfQotfqIhrnXdRAAWheizKjj1TJneb7SKOQ6MyBOFnyHLMz6Jfq4b65PM3AAnmZKOwDfETVLaqmGsxEnMQmf4CMO8=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIiB7PKVPC8GEkX3J0qg254MNFZoGkb3fEDn3gWcAxWr#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOythqhP1XsjL6ud77xFMQGJEIXov7vNcaM6K2FkJraq01uO/RYNnZNiDfWjj5Adt21nHYDPXzZj8g0Tq8H8MVM=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCo4ZDxRf6rnZi23TyfXU8gD/cqHjwuEsuSREGwe4ef7cMarBGNYbXxE2uGww7+rCBvVO/sk+/0L0Gl/KgUI9z2oqrerjCKCsYuyAttWzAL+GwxeYVIkV7aMrv14qfKJInma3jCImb8J327pv+aiZ+JMBPXaBrDvbehUivX7oqVK/PGWn57S/YMzSHEvRYIwR+sLl4LOzNPjoGVZC3PaP9BXJKEazr2Z1tR9MQgIwo5P2xw2pWMR2OK6UKptd47VFj1KAai2IyoNBVvKu7Eu0Bl7049xf4qG+JDfvl7j15BF+n0OL0BodffoBPkcjFq0KP4guYVIwdnnh8+bJ3a1xApfZOVD3vyK2r/VRuSYU9yL0E7AEzmYH9LCJP7s1pkF8/oSijembyTmD8H54WI6XInAHzaT0EvaIIiU+BloxyqVyKKADV+MqnZGaQcBYS7gnoBYPNpSm2r6ViXDvdi2qvL/wsCAYYwa2o8P7kZbK9meUYXjalJHOdoQc16uSCGaNs=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINaPz5uAdO6tkPM4jSv0B0ohJy4W7MB4/KYc2kpahjdj#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEHVWl+CHlGNSRLnOfM5r+IcQzz+9WP12BWCpmALuzl2BsNlP7R8imMxEnAMq5GVqoGE5o4Bid/0RL33ENhUHsE=#012 create=True mode=0644 path=/tmp/ansible.bjy7wr30 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:00 np0005546954 python3.9[75076]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.bjy7wr30' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:24:00 np0005546954 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  5 07:24:01 np0005546954 python3.9[75230]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.bjy7wr30 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:01 np0005546954 systemd[1]: session-17.scope: Deactivated successfully.
Dec  5 07:24:01 np0005546954 systemd[1]: session-17.scope: Consumed 3.248s CPU time.
Dec  5 07:24:01 np0005546954 systemd-logind[789]: Session 17 logged out. Waiting for processes to exit.
Dec  5 07:24:01 np0005546954 systemd-logind[789]: Removed session 17.
Dec  5 07:24:06 np0005546954 systemd-logind[789]: New session 18 of user zuul.
Dec  5 07:24:07 np0005546954 systemd[1]: Started Session 18 of User zuul.
Dec  5 07:24:08 np0005546954 python3.9[75411]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:24:09 np0005546954 python3.9[75567]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  5 07:24:10 np0005546954 python3.9[75721]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:24:13 np0005546954 python3.9[75874]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:24:13 np0005546954 python3.9[76027]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:24:14 np0005546954 python3.9[76181]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:24:15 np0005546954 python3.9[76336]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:15 np0005546954 systemd[1]: session-18.scope: Deactivated successfully.
Dec  5 07:24:15 np0005546954 systemd[1]: session-18.scope: Consumed 4.500s CPU time.
Dec  5 07:24:15 np0005546954 systemd-logind[789]: Session 18 logged out. Waiting for processes to exit.
Dec  5 07:24:15 np0005546954 systemd-logind[789]: Removed session 18.
Dec  5 07:24:21 np0005546954 systemd-logind[789]: New session 19 of user zuul.
Dec  5 07:24:21 np0005546954 systemd[1]: Started Session 19 of User zuul.
Dec  5 07:24:22 np0005546954 python3.9[76514]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:24:23 np0005546954 python3.9[76670]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:24:24 np0005546954 python3.9[76754]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 07:24:26 np0005546954 python3.9[76905]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:24:27 np0005546954 python3.9[77056]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  5 07:24:28 np0005546954 python3.9[77206]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:24:28 np0005546954 python3.9[77356]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:24:29 np0005546954 systemd[1]: session-19.scope: Deactivated successfully.
Dec  5 07:24:29 np0005546954 systemd[1]: session-19.scope: Consumed 5.792s CPU time.
Dec  5 07:24:29 np0005546954 systemd-logind[789]: Session 19 logged out. Waiting for processes to exit.
Dec  5 07:24:29 np0005546954 systemd-logind[789]: Removed session 19.
Dec  5 07:24:34 np0005546954 systemd-logind[789]: New session 20 of user zuul.
Dec  5 07:24:34 np0005546954 systemd[1]: Started Session 20 of User zuul.
Dec  5 07:24:35 np0005546954 python3.9[77534]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:24:37 np0005546954 python3.9[77690]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:24:37 np0005546954 python3.9[77842]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:24:38 np0005546954 python3.9[77994]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:39 np0005546954 python3.9[78117]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937478.14547-113-260562571889536/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=b097038a7e5f56bfaf3c06829487ce0b350052df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:40 np0005546954 python3.9[78269]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:40 np0005546954 python3.9[78392]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937479.6165705-113-79394595518775/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=be95063a282c99bc302f6de87baccb7047985949 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:41 np0005546954 python3.9[78544]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:41 np0005546954 python3.9[78667]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937480.7046185-113-106807925893671/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=91b5135fc057f15ff28b83ce5e129e1a319c112c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:42 np0005546954 python3.9[78819]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:24:43 np0005546954 python3.9[78971]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:24:43 np0005546954 python3.9[79123]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:44 np0005546954 python3.9[79246]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937483.2801597-231-56493390758402/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=e4f1cfa38d40b2cf3b18ea464d9a51217b7f27fb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:45 np0005546954 python3.9[79398]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:45 np0005546954 python3.9[79521]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937484.6346834-231-195446963202717/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=0b240d8b817a8d5ea30142f7db496ef6e211376c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:46 np0005546954 python3.9[79673]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:46 np0005546954 python3.9[79796]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937485.8199248-231-237809091194194/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=de88b30bc05ce678084f0db8e466a8baa8ab4ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:47 np0005546954 python3.9[79948]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:24:48 np0005546954 python3.9[80100]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:24:48 np0005546954 python3.9[80252]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:49 np0005546954 python3.9[80375]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937488.3165746-353-203269795765718/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=4b9e3f586dc4ae91406f3d3b3875d98f33e440fe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:49 np0005546954 python3.9[80527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:50 np0005546954 python3.9[80650]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937489.4905012-353-17687989086131/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=c67d0fa3190548453210fe4d523f867fe3d85cfa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:51 np0005546954 python3.9[80802]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:51 np0005546954 python3.9[80925]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937490.6578412-353-98484769723079/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=675b875bcdbe832764fb19149da5eae83484fae7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:52 np0005546954 python3.9[81077]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:24:53 np0005546954 python3.9[81229]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:24:53 np0005546954 python3.9[81381]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:54 np0005546954 python3.9[81504]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937493.2842724-475-28357876353527/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=c71ecf7eb490bd2257518ab01f04b78cfd022824 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:54 np0005546954 python3.9[81656]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:55 np0005546954 python3.9[81779]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937494.4665856-475-158855034754122/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=c67d0fa3190548453210fe4d523f867fe3d85cfa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:56 np0005546954 python3.9[81931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:56 np0005546954 python3.9[82054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937495.6445498-475-113265340846041/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=6fe07c7b284e3b056ea6e2784c6fbcd155f9be9b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:57 np0005546954 python3.9[82206]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:24:58 np0005546954 python3.9[82358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:24:59 np0005546954 python3.9[82481]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937498.1353357-607-268149727458555/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=59ae341700d0fc55e2ef1fdd1f1ac8c51deabc0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:24:59 np0005546954 python3.9[82633]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:25:00 np0005546954 python3.9[82785]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:01 np0005546954 python3.9[82908]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937500.1359818-655-118713511623877/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=59ae341700d0fc55e2ef1fdd1f1ac8c51deabc0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:02 np0005546954 python3.9[83060]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:25:02 np0005546954 python3.9[83212]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:03 np0005546954 python3.9[83335]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937502.304387-707-134097123535000/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=59ae341700d0fc55e2ef1fdd1f1ac8c51deabc0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:03 np0005546954 chronyd[65097]: Selected source 54.39.23.64 (pool.ntp.org)
Dec  5 07:25:04 np0005546954 python3.9[83487]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:25:04 np0005546954 python3.9[83639]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:05 np0005546954 python3.9[83762]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937504.2772193-758-68942085125529/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=59ae341700d0fc55e2ef1fdd1f1ac8c51deabc0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:05 np0005546954 python3.9[83914]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:25:06 np0005546954 python3.9[84066]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:07 np0005546954 python3.9[84189]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937506.1129062-806-175766641329512/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=59ae341700d0fc55e2ef1fdd1f1ac8c51deabc0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:07 np0005546954 python3.9[84341]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:25:08 np0005546954 python3.9[84493]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:09 np0005546954 python3.9[84616]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937508.1516738-853-180644552626962/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=59ae341700d0fc55e2ef1fdd1f1ac8c51deabc0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:09 np0005546954 python3.9[84768]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:25:10 np0005546954 python3.9[84920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:11 np0005546954 python3.9[85043]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937510.048282-900-142483896307812/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=59ae341700d0fc55e2ef1fdd1f1ac8c51deabc0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:11 np0005546954 systemd[1]: session-20.scope: Deactivated successfully.
Dec  5 07:25:11 np0005546954 systemd[1]: session-20.scope: Consumed 28.699s CPU time.
Dec  5 07:25:11 np0005546954 systemd-logind[789]: Session 20 logged out. Waiting for processes to exit.
Dec  5 07:25:11 np0005546954 systemd-logind[789]: Removed session 20.
Dec  5 07:25:17 np0005546954 systemd-logind[789]: New session 21 of user zuul.
Dec  5 07:25:17 np0005546954 systemd[1]: Started Session 21 of User zuul.
Dec  5 07:25:18 np0005546954 python3.9[85221]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:25:19 np0005546954 python3.9[85377]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:25:19 np0005546954 python3.9[85529]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:25:20 np0005546954 python3.9[85679]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:25:21 np0005546954 python3.9[85831]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  5 07:25:23 np0005546954 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec  5 07:25:23 np0005546954 python3.9[85987]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:25:24 np0005546954 python3.9[86071]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:25:27 np0005546954 python3.9[86224]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 07:25:28 np0005546954 python3[86379]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec  5 07:25:28 np0005546954 python3.9[86531]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:29 np0005546954 python3.9[86683]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:30 np0005546954 python3.9[86761]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:30 np0005546954 python3.9[86913]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:31 np0005546954 python3.9[86991]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.rj5d5v2i recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:31 np0005546954 python3.9[87143]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:32 np0005546954 python3.9[87221]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:33 np0005546954 python3.9[87373]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:25:34 np0005546954 python3[87526]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  5 07:25:34 np0005546954 python3.9[87678]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:35 np0005546954 python3.9[87803]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937534.443149-295-249379157310053/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:36 np0005546954 python3.9[87955]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:37 np0005546954 python3.9[88080]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937535.938981-325-158146324342135/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:37 np0005546954 python3.9[88232]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:38 np0005546954 python3.9[88357]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937537.4406917-355-133193172583355/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:39 np0005546954 python3.9[88509]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:39 np0005546954 python3.9[88634]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937538.7247005-385-102367464850407/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:40 np0005546954 python3.9[88786]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:41 np0005546954 python3.9[88911]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937539.968979-415-195219951815019/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:41 np0005546954 python3.9[89063]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:42 np0005546954 python3.9[89215]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:25:43 np0005546954 python3.9[89370]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:44 np0005546954 python3.9[89522]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:25:45 np0005546954 python3.9[89675]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:25:46 np0005546954 python3.9[89829]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:25:46 np0005546954 python3.9[89984]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:48 np0005546954 python3.9[90134]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:25:49 np0005546954 python3.9[90287]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:25:49 np0005546954 ovs-vsctl[90288]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec  5 07:25:50 np0005546954 python3.9[90440]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:25:50 np0005546954 python3.9[90595]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:25:50 np0005546954 ovs-vsctl[90596]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec  5 07:25:51 np0005546954 python3.9[90746]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:25:52 np0005546954 python3.9[90900]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:25:52 np0005546954 python3.9[91052]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:53 np0005546954 python3.9[91130]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:25:53 np0005546954 python3.9[91282]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:54 np0005546954 python3.9[91360]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:25:55 np0005546954 python3.9[91512]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:56 np0005546954 python3.9[91664]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:56 np0005546954 python3.9[91742]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:57 np0005546954 python3.9[91894]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:25:57 np0005546954 python3.9[91972]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:25:58 np0005546954 python3.9[92124]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:25:58 np0005546954 systemd[1]: Reloading.
Dec  5 07:25:58 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:25:58 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:25:59 np0005546954 python3.9[92314]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:00 np0005546954 python3.9[92392]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:26:00 np0005546954 python3.9[92544]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:01 np0005546954 python3.9[92622]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:26:02 np0005546954 python3.9[92774]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:26:02 np0005546954 systemd[1]: Reloading.
Dec  5 07:26:02 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:26:02 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:26:02 np0005546954 systemd[1]: Starting Create netns directory...
Dec  5 07:26:02 np0005546954 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  5 07:26:02 np0005546954 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  5 07:26:02 np0005546954 systemd[1]: Finished Create netns directory.
Dec  5 07:26:03 np0005546954 python3.9[92967]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:04 np0005546954 python3.9[93119]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:04 np0005546954 python3.9[93242]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937563.6605015-917-107729052456658/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:05 np0005546954 python3.9[93394]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:06 np0005546954 python3.9[93546]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:07 np0005546954 python3.9[93669]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937565.8672879-967-76533769547359/.source.json _original_basename=.1vce5ld3 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:26:07 np0005546954 python3.9[93821]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:26:10 np0005546954 python3.9[94248]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec  5 07:26:11 np0005546954 python3.9[94400]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 07:26:12 np0005546954 python3.9[94553]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  5 07:26:12 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:26:14 np0005546954 python3[94716]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 07:26:14 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:26:14 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:26:14 np0005546954 podman[94751]: 2025-12-05 12:26:14.262874622 +0000 UTC m=+0.072902630 container create 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec  5 07:26:14 np0005546954 podman[94751]: 2025-12-05 12:26:14.217029183 +0000 UTC m=+0.027057211 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  5 07:26:14 np0005546954 python3[94716]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  5 07:26:15 np0005546954 python3.9[94941]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:26:15 np0005546954 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 07:26:16 np0005546954 python3.9[95095]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:26:16 np0005546954 python3.9[95171]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:26:17 np0005546954 python3.9[95322]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764937576.5459208-1143-258102725176981/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:26:17 np0005546954 python3.9[95398]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 07:26:17 np0005546954 systemd[1]: Reloading.
Dec  5 07:26:17 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:26:17 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:26:18 np0005546954 python3.9[95509]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:26:18 np0005546954 systemd[1]: Reloading.
Dec  5 07:26:18 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:26:18 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:26:18 np0005546954 systemd[1]: Starting ovn_controller container...
Dec  5 07:26:18 np0005546954 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec  5 07:26:18 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:26:18 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc69671dc2bc3a5ced328cc0a50815fa8824a8952f77873bad3abc58c3b19386/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  5 07:26:18 np0005546954 systemd[1]: Started /usr/bin/podman healthcheck run 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc.
Dec  5 07:26:18 np0005546954 podman[95551]: 2025-12-05 12:26:18.997978571 +0000 UTC m=+0.137562839 container init 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: + sudo -E kolla_set_configs
Dec  5 07:26:19 np0005546954 podman[95551]: 2025-12-05 12:26:19.028544627 +0000 UTC m=+0.168128885 container start 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Dec  5 07:26:19 np0005546954 edpm-start-podman-container[95551]: ovn_controller
Dec  5 07:26:19 np0005546954 systemd[1]: Created slice User Slice of UID 0.
Dec  5 07:26:19 np0005546954 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec  5 07:26:19 np0005546954 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec  5 07:26:19 np0005546954 systemd[1]: Starting User Manager for UID 0...
Dec  5 07:26:19 np0005546954 edpm-start-podman-container[95550]: Creating additional drop-in dependency for "ovn_controller" (0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc)
Dec  5 07:26:19 np0005546954 podman[95573]: 2025-12-05 12:26:19.104585371 +0000 UTC m=+0.064094473 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  5 07:26:19 np0005546954 systemd[1]: 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc-468c32e38227061c.service: Main process exited, code=exited, status=1/FAILURE
Dec  5 07:26:19 np0005546954 systemd[1]: 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc-468c32e38227061c.service: Failed with result 'exit-code'.
Dec  5 07:26:19 np0005546954 systemd[1]: Reloading.
Dec  5 07:26:19 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:26:19 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:26:19 np0005546954 systemd[95610]: Queued start job for default target Main User Target.
Dec  5 07:26:19 np0005546954 systemd[95610]: Created slice User Application Slice.
Dec  5 07:26:19 np0005546954 systemd[95610]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec  5 07:26:19 np0005546954 systemd[95610]: Started Daily Cleanup of User's Temporary Directories.
Dec  5 07:26:19 np0005546954 systemd[95610]: Reached target Paths.
Dec  5 07:26:19 np0005546954 systemd[95610]: Reached target Timers.
Dec  5 07:26:19 np0005546954 systemd[95610]: Starting D-Bus User Message Bus Socket...
Dec  5 07:26:19 np0005546954 systemd[95610]: Starting Create User's Volatile Files and Directories...
Dec  5 07:26:19 np0005546954 systemd[95610]: Finished Create User's Volatile Files and Directories.
Dec  5 07:26:19 np0005546954 systemd[95610]: Listening on D-Bus User Message Bus Socket.
Dec  5 07:26:19 np0005546954 systemd[95610]: Reached target Sockets.
Dec  5 07:26:19 np0005546954 systemd[95610]: Reached target Basic System.
Dec  5 07:26:19 np0005546954 systemd[95610]: Reached target Main User Target.
Dec  5 07:26:19 np0005546954 systemd[95610]: Startup finished in 128ms.
Dec  5 07:26:19 np0005546954 systemd[1]: Started User Manager for UID 0.
Dec  5 07:26:19 np0005546954 systemd[1]: Started ovn_controller container.
Dec  5 07:26:19 np0005546954 systemd[1]: Started Session c1 of User root.
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: INFO:__main__:Validating config file
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: INFO:__main__:Writing out command to execute
Dec  5 07:26:19 np0005546954 systemd[1]: session-c1.scope: Deactivated successfully.
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: ++ cat /run_command
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: + ARGS=
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: + sudo kolla_copy_cacerts
Dec  5 07:26:19 np0005546954 systemd[1]: Started Session c2 of User root.
Dec  5 07:26:19 np0005546954 systemd[1]: session-c2.scope: Deactivated successfully.
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: + [[ ! -n '' ]]
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: + . kolla_extend_start
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: + umask 0022
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec  5 07:26:19 np0005546954 NetworkManager[55665]: <info>  [1764937579.4818] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec  5 07:26:19 np0005546954 NetworkManager[55665]: <info>  [1764937579.4824] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:26:19 np0005546954 NetworkManager[55665]: <info>  [1764937579.4833] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Dec  5 07:26:19 np0005546954 NetworkManager[55665]: <info>  [1764937579.4837] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Dec  5 07:26:19 np0005546954 NetworkManager[55665]: <info>  [1764937579.4839] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  5 07:26:19 np0005546954 kernel: br-int: entered promiscuous mode
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00019|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00023|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00024|main|INFO|OVS feature set changed, force recompute.
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00001|statctrl(ovn_statctrl1)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00002|rconn(ovn_statctrl1)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  5 07:26:19 np0005546954 NetworkManager[55665]: <info>  [1764937579.5060] manager: (ovn-97625a-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec  5 07:26:19 np0005546954 NetworkManager[55665]: <info>  [1764937579.5069] manager: (ovn-b049cd-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Dec  5 07:26:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:19Z|00003|rconn(ovn_statctrl1)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  5 07:26:19 np0005546954 kernel: genev_sys_6081: entered promiscuous mode
Dec  5 07:26:19 np0005546954 NetworkManager[55665]: <info>  [1764937579.5281] device (genev_sys_6081): carrier: link connected
Dec  5 07:26:19 np0005546954 NetworkManager[55665]: <info>  [1764937579.5284] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Dec  5 07:26:19 np0005546954 systemd-udevd[95726]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:26:19 np0005546954 systemd-udevd[95731]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:26:20 np0005546954 python3.9[95836]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:26:20 np0005546954 ovs-vsctl[95837]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec  5 07:26:20 np0005546954 python3.9[95989]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:26:20 np0005546954 ovs-vsctl[95991]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec  5 07:26:21 np0005546954 python3.9[96144]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:26:21 np0005546954 ovs-vsctl[96145]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec  5 07:26:22 np0005546954 systemd[1]: session-21.scope: Deactivated successfully.
Dec  5 07:26:22 np0005546954 systemd[1]: session-21.scope: Consumed 44.549s CPU time.
Dec  5 07:26:22 np0005546954 systemd-logind[789]: Session 21 logged out. Waiting for processes to exit.
Dec  5 07:26:22 np0005546954 systemd-logind[789]: Removed session 21.
Dec  5 07:26:27 np0005546954 systemd-logind[789]: New session 23 of user zuul.
Dec  5 07:26:27 np0005546954 systemd[1]: Started Session 23 of User zuul.
Dec  5 07:26:28 np0005546954 python3.9[96324]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:26:29 np0005546954 systemd[1]: Stopping User Manager for UID 0...
Dec  5 07:26:29 np0005546954 systemd[95610]: Activating special unit Exit the Session...
Dec  5 07:26:29 np0005546954 systemd[95610]: Stopped target Main User Target.
Dec  5 07:26:29 np0005546954 systemd[95610]: Stopped target Basic System.
Dec  5 07:26:29 np0005546954 systemd[95610]: Stopped target Paths.
Dec  5 07:26:29 np0005546954 systemd[95610]: Stopped target Sockets.
Dec  5 07:26:29 np0005546954 systemd[95610]: Stopped target Timers.
Dec  5 07:26:29 np0005546954 systemd[95610]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  5 07:26:29 np0005546954 systemd[95610]: Closed D-Bus User Message Bus Socket.
Dec  5 07:26:29 np0005546954 systemd[95610]: Stopped Create User's Volatile Files and Directories.
Dec  5 07:26:29 np0005546954 systemd[95610]: Removed slice User Application Slice.
Dec  5 07:26:29 np0005546954 systemd[95610]: Reached target Shutdown.
Dec  5 07:26:29 np0005546954 systemd[95610]: Finished Exit the Session.
Dec  5 07:26:29 np0005546954 systemd[95610]: Reached target Exit the Session.
Dec  5 07:26:29 np0005546954 systemd[1]: user@0.service: Deactivated successfully.
Dec  5 07:26:29 np0005546954 systemd[1]: Stopped User Manager for UID 0.
Dec  5 07:26:29 np0005546954 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec  5 07:26:29 np0005546954 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec  5 07:26:29 np0005546954 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec  5 07:26:29 np0005546954 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec  5 07:26:29 np0005546954 systemd[1]: Removed slice User Slice of UID 0.
Dec  5 07:26:29 np0005546954 python3.9[96481]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:30 np0005546954 python3.9[96633]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:31 np0005546954 python3.9[96785]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:31 np0005546954 python3.9[96937]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:32 np0005546954 python3.9[97089]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:33 np0005546954 python3.9[97240]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:26:34 np0005546954 python3.9[97392]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  5 07:26:35 np0005546954 python3.9[97542]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:36 np0005546954 python3.9[97663]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937595.021997-153-194916663348234/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:37 np0005546954 python3.9[97813]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:37 np0005546954 python3.9[97934]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937596.5561192-183-253097526084895/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:38 np0005546954 python3.9[98087]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:26:39 np0005546954 python3.9[98171]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:26:41 np0005546954 python3.9[98324]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 07:26:42 np0005546954 python3.9[98477]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:43 np0005546954 python3.9[98598]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937602.272971-257-122576321987642/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:44 np0005546954 python3.9[98748]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:44 np0005546954 python3.9[98869]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937603.534426-257-152755278080699/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:45 np0005546954 python3.9[99019]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:46 np0005546954 python3.9[99140]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937605.4137478-345-270744643221954/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:47 np0005546954 python3.9[99290]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:47 np0005546954 python3.9[99411]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937606.6678236-345-179893417310727/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:48 np0005546954 python3.9[99561]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:26:49 np0005546954 python3.9[99715]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:49 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:49Z|00025|memory|INFO|16000 kB peak resident set size after 30.1 seconds
Dec  5 07:26:49 np0005546954 ovn_controller[95566]: 2025-12-05T12:26:49Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Dec  5 07:26:49 np0005546954 podman[99792]: 2025-12-05 12:26:49.655057167 +0000 UTC m=+0.155927195 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:26:49 np0005546954 python3.9[99893]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:50 np0005546954 python3.9[99971]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:51 np0005546954 python3.9[100123]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:51 np0005546954 python3.9[100201]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:26:52 np0005546954 python3.9[100353]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:26:52 np0005546954 python3.9[100505]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:53 np0005546954 python3.9[100583]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:26:54 np0005546954 python3.9[100735]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:54 np0005546954 python3.9[100813]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:26:55 np0005546954 python3.9[100965]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:26:55 np0005546954 systemd[1]: Reloading.
Dec  5 07:26:55 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:26:55 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:26:56 np0005546954 python3.9[101154]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:56 np0005546954 python3.9[101232]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:26:57 np0005546954 python3.9[101384]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:26:57 np0005546954 python3.9[101462]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:26:58 np0005546954 python3.9[101614]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:26:58 np0005546954 systemd[1]: Reloading.
Dec  5 07:26:58 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:26:58 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:26:59 np0005546954 systemd[1]: Starting Create netns directory...
Dec  5 07:26:59 np0005546954 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  5 07:26:59 np0005546954 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  5 07:26:59 np0005546954 systemd[1]: Finished Create netns directory.
Dec  5 07:26:59 np0005546954 python3.9[101807]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:27:00 np0005546954 python3.9[101959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:27:01 np0005546954 python3.9[102082]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937620.2494712-647-279607297923278/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:27:02 np0005546954 python3.9[102234]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:27:02 np0005546954 python3.9[102386]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:27:03 np0005546954 python3.9[102509]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937622.4554784-697-68020070560056/.source.json _original_basename=.wwk3e40n follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:04 np0005546954 python3.9[102661]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:06 np0005546954 python3.9[103088]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec  5 07:27:07 np0005546954 python3.9[103240]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 07:27:08 np0005546954 python3.9[103392]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  5 07:27:10 np0005546954 python3[103570]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 07:27:10 np0005546954 podman[103608]: 2025-12-05 12:27:10.225631554 +0000 UTC m=+0.051876232 container create cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec  5 07:27:10 np0005546954 podman[103608]: 2025-12-05 12:27:10.199470792 +0000 UTC m=+0.025715490 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:27:10 np0005546954 python3[103570]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:27:11 np0005546954 python3.9[103797]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:27:11 np0005546954 python3.9[103951]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:12 np0005546954 python3.9[104027]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:27:13 np0005546954 python3.9[104178]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764937632.3608816-873-25717394558817/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:13 np0005546954 python3.9[104254]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 07:27:13 np0005546954 systemd[1]: Reloading.
Dec  5 07:27:13 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:27:13 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:27:14 np0005546954 python3.9[104365]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:27:14 np0005546954 systemd[1]: Reloading.
Dec  5 07:27:14 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:27:14 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:27:14 np0005546954 systemd[1]: Starting ovn_metadata_agent container...
Dec  5 07:27:14 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:27:14 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82ea63242266d79b4465e11f0a5b2daabdefe33cc8a8e5c8e5d06cbb454e693/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec  5 07:27:14 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82ea63242266d79b4465e11f0a5b2daabdefe33cc8a8e5c8e5d06cbb454e693/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:27:14 np0005546954 systemd[1]: Started /usr/bin/podman healthcheck run cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806.
Dec  5 07:27:14 np0005546954 podman[104406]: 2025-12-05 12:27:14.880534769 +0000 UTC m=+0.133286466 container init cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: + sudo -E kolla_set_configs
Dec  5 07:27:14 np0005546954 podman[104406]: 2025-12-05 12:27:14.911578764 +0000 UTC m=+0.164330411 container start cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:27:14 np0005546954 edpm-start-podman-container[104406]: ovn_metadata_agent
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Validating config file
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Copying service configuration files
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Writing out command to execute
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Setting permission for /var/lib/neutron
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec  5 07:27:14 np0005546954 edpm-start-podman-container[104405]: Creating additional drop-in dependency for "ovn_metadata_agent" (cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806)
Dec  5 07:27:14 np0005546954 podman[104430]: 2025-12-05 12:27:14.968342537 +0000 UTC m=+0.046038362 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: ++ cat /run_command
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: + CMD=neutron-ovn-metadata-agent
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: + ARGS=
Dec  5 07:27:14 np0005546954 ovn_metadata_agent[104423]: + sudo kolla_copy_cacerts
Dec  5 07:27:14 np0005546954 systemd[1]: Reloading.
Dec  5 07:27:15 np0005546954 ovn_metadata_agent[104423]: + [[ ! -n '' ]]
Dec  5 07:27:15 np0005546954 ovn_metadata_agent[104423]: + . kolla_extend_start
Dec  5 07:27:15 np0005546954 ovn_metadata_agent[104423]: Running command: 'neutron-ovn-metadata-agent'
Dec  5 07:27:15 np0005546954 ovn_metadata_agent[104423]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec  5 07:27:15 np0005546954 ovn_metadata_agent[104423]: + umask 0022
Dec  5 07:27:15 np0005546954 ovn_metadata_agent[104423]: + exec neutron-ovn-metadata-agent
Dec  5 07:27:15 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:27:15 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:27:15 np0005546954 systemd[1]: Started ovn_metadata_agent container.
Dec  5 07:27:15 np0005546954 systemd[1]: session-23.scope: Deactivated successfully.
Dec  5 07:27:15 np0005546954 systemd[1]: session-23.scope: Consumed 35.722s CPU time.
Dec  5 07:27:15 np0005546954 systemd-logind[789]: Session 23 logged out. Waiting for processes to exit.
Dec  5 07:27:15 np0005546954 systemd-logind[789]: Removed session 23.
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.875 104428 INFO neutron.common.config [-] Logging enabled!#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.875 104428 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.876 104428 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.876 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.876 104428 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.876 104428 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.876 104428 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.877 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.877 104428 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.877 104428 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.877 104428 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.877 104428 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.877 104428 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.877 104428 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.877 104428 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.877 104428 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.877 104428 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.878 104428 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.878 104428 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.878 104428 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.878 104428 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.878 104428 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.878 104428 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.878 104428 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.878 104428 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.878 104428 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.878 104428 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.879 104428 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.879 104428 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.879 104428 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.879 104428 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.879 104428 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.879 104428 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.879 104428 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.879 104428 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.879 104428 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.880 104428 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.880 104428 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.880 104428 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.880 104428 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.880 104428 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.880 104428 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.880 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.880 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.881 104428 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.881 104428 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.881 104428 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.881 104428 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.881 104428 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.881 104428 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.881 104428 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.881 104428 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.881 104428 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.881 104428 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.882 104428 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.882 104428 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.882 104428 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.882 104428 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.882 104428 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.882 104428 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.882 104428 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.882 104428 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.882 104428 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.882 104428 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.883 104428 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.883 104428 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.883 104428 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.883 104428 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.883 104428 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.883 104428 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.883 104428 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.883 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.883 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.884 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.884 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.884 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.884 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.884 104428 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.884 104428 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.884 104428 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.884 104428 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.886 104428 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.886 104428 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.887 104428 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.887 104428 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.887 104428 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.887 104428 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.887 104428 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.887 104428 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.887 104428 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.887 104428 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.887 104428 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.887 104428 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.888 104428 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.888 104428 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.888 104428 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.888 104428 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.888 104428 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.888 104428 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.888 104428 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.888 104428 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.888 104428 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.888 104428 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.889 104428 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.889 104428 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.889 104428 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.889 104428 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.889 104428 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.889 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.889 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.889 104428 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.890 104428 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.890 104428 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.890 104428 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.890 104428 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.890 104428 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.890 104428 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.890 104428 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.890 104428 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.890 104428 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.890 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.891 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.891 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.891 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.891 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.891 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.891 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.891 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.891 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.891 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.892 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.892 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.892 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.892 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.892 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.892 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.892 104428 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.892 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.892 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.893 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.893 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.893 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.893 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.893 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.893 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.893 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.893 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.893 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.894 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.894 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.894 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.894 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.894 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.894 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.894 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.894 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.894 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.895 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.895 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.895 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.895 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.895 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.895 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.895 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.895 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.895 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.895 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.896 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.896 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.896 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.896 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.896 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.896 104428 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.896 104428 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.896 104428 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.896 104428 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.897 104428 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.897 104428 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.897 104428 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.897 104428 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.897 104428 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.897 104428 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.897 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.897 104428 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.897 104428 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.897 104428 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.898 104428 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.898 104428 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.898 104428 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.898 104428 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.898 104428 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.898 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.898 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.898 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.898 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.899 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.899 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.899 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.899 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.899 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.899 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.899 104428 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.899 104428 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.899 104428 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.899 104428 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.900 104428 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.900 104428 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.900 104428 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.900 104428 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.900 104428 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.900 104428 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.900 104428 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.900 104428 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.900 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.901 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.901 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.901 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.901 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.901 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.901 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.901 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.901 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.901 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.901 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.902 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.902 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.902 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.902 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.902 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.902 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.902 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.902 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.902 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.903 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.903 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.903 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.903 104428 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.903 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.903 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.903 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.903 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.903 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.904 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.904 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.904 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.904 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.904 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.904 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.904 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.904 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.905 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.905 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.905 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.905 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.905 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.905 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.905 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.905 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.905 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.906 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.906 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.906 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.906 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.906 104428 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.906 104428 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.906 104428 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.906 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.907 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.907 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.907 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.907 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.907 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.907 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.908 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.908 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.908 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.908 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.908 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.908 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.908 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.909 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.909 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.909 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.909 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.909 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.909 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.909 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.909 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.909 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.910 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.910 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.910 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.910 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.910 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.910 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.910 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.910 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.910 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.911 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.911 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.911 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.911 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.911 104428 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.911 104428 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.922 104428 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.923 104428 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.923 104428 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.923 104428 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.923 104428 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.936 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 47f9f74c-08f9-451f-9678-93bb9e8fa2fe (UUID: 47f9f74c-08f9-451f-9678-93bb9e8fa2fe) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.962 104428 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.963 104428 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.963 104428 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.963 104428 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.966 104428 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.972 104428 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.977 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '47f9f74c-08f9-451f-9678-93bb9e8fa2fe'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], external_ids={}, name=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, nb_cfg_timestamp=1764937587505, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.978 104428 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fd854344b80>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.978 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.979 104428 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.979 104428 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.979 104428 INFO oslo_service.service [-] Starting 1 workers#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.984 104428 DEBUG oslo_service.service [-] Started child 104537 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.987 104537 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-455056'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Dec  5 07:27:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:16.989 104428 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp_3l82hbl/privsep.sock']#033[00m
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.008 104537 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.008 104537 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.008 104537 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.012 104537 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.017 104537 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.023 104537 INFO eventlet.wsgi.server [-] (104537) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Dec  5 07:27:17 np0005546954 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.731 104428 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.732 104428 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp_3l82hbl/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.561 104542 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.565 104542 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.570 104542 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.570 104542 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104542#033[00m
Dec  5 07:27:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:17.734 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[39d1a28d-65cf-426c-a7c8-f7f102bfdd28]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.282 104542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.282 104542 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.282 104542 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.891 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[db5c41f5-53d6-4769-8655-9e5baaaff948]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.894 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, column=external_ids, values=({'neutron:ovn-metadata-id': '8573c6e0-1094-5064-8562-4056b6c1e4c4'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.901 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.912 104428 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.912 104428 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.912 104428 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.912 104428 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.912 104428 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.912 104428 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.913 104428 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.913 104428 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.913 104428 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.913 104428 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.913 104428 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.913 104428 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.913 104428 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.914 104428 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.914 104428 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.914 104428 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.914 104428 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.914 104428 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.914 104428 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.914 104428 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.915 104428 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.915 104428 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.915 104428 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.915 104428 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.915 104428 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.915 104428 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.915 104428 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.915 104428 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.916 104428 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.916 104428 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.916 104428 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.916 104428 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.916 104428 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.916 104428 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.916 104428 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.916 104428 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.917 104428 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.917 104428 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.917 104428 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.917 104428 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.917 104428 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.917 104428 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.917 104428 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.918 104428 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.918 104428 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.918 104428 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.918 104428 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.918 104428 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.918 104428 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.918 104428 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.918 104428 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.918 104428 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.918 104428 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.919 104428 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.919 104428 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.919 104428 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.919 104428 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.919 104428 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.919 104428 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.919 104428 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.919 104428 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.919 104428 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.919 104428 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.920 104428 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.920 104428 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.920 104428 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.920 104428 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.920 104428 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.920 104428 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.920 104428 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.920 104428 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.920 104428 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.920 104428 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.921 104428 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.921 104428 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.921 104428 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.921 104428 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.921 104428 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.921 104428 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.921 104428 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.921 104428 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.921 104428 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.922 104428 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.922 104428 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.922 104428 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.922 104428 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.922 104428 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.922 104428 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.922 104428 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.922 104428 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.922 104428 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.922 104428 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.923 104428 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.923 104428 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.923 104428 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.923 104428 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.923 104428 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.923 104428 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.923 104428 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.923 104428 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.923 104428 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.923 104428 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.924 104428 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.924 104428 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.924 104428 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.924 104428 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.924 104428 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.924 104428 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.924 104428 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.924 104428 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.925 104428 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.925 104428 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.925 104428 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.925 104428 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.925 104428 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.925 104428 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.926 104428 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.926 104428 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.926 104428 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.926 104428 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.926 104428 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.926 104428 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.927 104428 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.927 104428 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.927 104428 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.927 104428 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.927 104428 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.927 104428 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.927 104428 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.927 104428 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.928 104428 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.928 104428 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.928 104428 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.928 104428 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.928 104428 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.928 104428 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.928 104428 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.929 104428 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.929 104428 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.929 104428 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.929 104428 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.929 104428 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.929 104428 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.930 104428 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.930 104428 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.930 104428 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.930 104428 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.930 104428 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.930 104428 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.930 104428 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.930 104428 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.930 104428 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.930 104428 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.931 104428 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.931 104428 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.931 104428 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.931 104428 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.931 104428 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.931 104428 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.931 104428 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.931 104428 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.931 104428 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.932 104428 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.932 104428 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.932 104428 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.932 104428 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.932 104428 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.932 104428 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.932 104428 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.932 104428 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.932 104428 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.933 104428 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.933 104428 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.933 104428 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.933 104428 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.933 104428 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.933 104428 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.933 104428 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.933 104428 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.933 104428 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.934 104428 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.934 104428 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.934 104428 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.934 104428 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.934 104428 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.934 104428 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.934 104428 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.934 104428 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.935 104428 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.935 104428 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.935 104428 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.935 104428 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.935 104428 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.935 104428 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.935 104428 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.935 104428 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.935 104428 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.936 104428 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.936 104428 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.936 104428 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.936 104428 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.936 104428 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.936 104428 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.936 104428 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.936 104428 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.936 104428 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.937 104428 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.937 104428 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.937 104428 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.937 104428 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.937 104428 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.937 104428 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.937 104428 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.937 104428 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.937 104428 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.937 104428 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.938 104428 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.938 104428 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.938 104428 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.938 104428 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.938 104428 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.938 104428 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.938 104428 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.938 104428 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.938 104428 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.939 104428 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.939 104428 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.939 104428 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.939 104428 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.939 104428 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.939 104428 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.939 104428 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.939 104428 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.939 104428 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.940 104428 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.940 104428 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.940 104428 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.940 104428 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.940 104428 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.940 104428 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.940 104428 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.941 104428 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.941 104428 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.941 104428 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.941 104428 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.941 104428 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.941 104428 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.941 104428 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.941 104428 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.941 104428 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.942 104428 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.942 104428 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.942 104428 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.942 104428 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.942 104428 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.942 104428 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.942 104428 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.942 104428 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.943 104428 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.943 104428 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.943 104428 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.943 104428 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.943 104428 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.943 104428 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.943 104428 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.943 104428 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.943 104428 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.944 104428 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.944 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.944 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.944 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.944 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.944 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.944 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.944 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.945 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.945 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.945 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.945 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.945 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.945 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.945 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.945 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.945 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.946 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.946 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.946 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.946 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.946 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.946 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.946 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.946 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.946 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.947 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.947 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.947 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.947 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.947 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.947 104428 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.947 104428 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.947 104428 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.947 104428 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.948 104428 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:27:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:27:18.948 104428 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  5 07:27:20 np0005546954 podman[104547]: 2025-12-05 12:27:20.608024689 +0000 UTC m=+0.112755017 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec  5 07:27:21 np0005546954 systemd-logind[789]: New session 24 of user zuul.
Dec  5 07:27:21 np0005546954 systemd[1]: Started Session 24 of User zuul.
Dec  5 07:27:22 np0005546954 python3.9[104727]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:27:23 np0005546954 python3.9[104883]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:27:25 np0005546954 python3.9[105047]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 07:27:25 np0005546954 systemd[1]: Reloading.
Dec  5 07:27:25 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:27:25 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:27:26 np0005546954 python3.9[105233]: ansible-ansible.builtin.service_facts Invoked
Dec  5 07:27:26 np0005546954 network[105250]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 07:27:26 np0005546954 network[105251]: 'network-scripts' will be removed from distribution in near future.
Dec  5 07:27:26 np0005546954 network[105252]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 07:27:32 np0005546954 python3.9[105513]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:27:33 np0005546954 python3.9[105666]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:27:34 np0005546954 python3.9[105819]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:27:35 np0005546954 python3.9[105972]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:27:35 np0005546954 python3.9[106125]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:27:36 np0005546954 python3.9[106278]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:27:37 np0005546954 python3.9[106431]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:27:39 np0005546954 python3.9[106584]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:39 np0005546954 python3.9[106736]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:40 np0005546954 python3.9[106888]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:40 np0005546954 python3.9[107040]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:41 np0005546954 python3.9[107192]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:42 np0005546954 python3.9[107344]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:42 np0005546954 python3.9[107496]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:43 np0005546954 python3.9[107648]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:44 np0005546954 python3.9[107800]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:44 np0005546954 python3.9[107952]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:45 np0005546954 podman[108076]: 2025-12-05 12:27:45.284382948 +0000 UTC m=+0.071865095 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec  5 07:27:45 np0005546954 python3.9[108119]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:46 np0005546954 python3.9[108274]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:46 np0005546954 python3.9[108426]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:47 np0005546954 python3.9[108578]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:27:48 np0005546954 python3.9[108730]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:27:49 np0005546954 python3.9[108882]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  5 07:27:50 np0005546954 python3.9[109034]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 07:27:50 np0005546954 systemd[1]: Reloading.
Dec  5 07:27:50 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:27:50 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:27:51 np0005546954 podman[109193]: 2025-12-05 12:27:51.024349014 +0000 UTC m=+0.101785507 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 07:27:51 np0005546954 python3.9[109240]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:27:51 np0005546954 python3.9[109400]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:27:52 np0005546954 python3.9[109553]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:27:53 np0005546954 python3.9[109706]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:27:53 np0005546954 python3.9[109859]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:27:54 np0005546954 python3.9[110012]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:27:55 np0005546954 python3.9[110165]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:27:56 np0005546954 python3.9[110318]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec  5 07:27:57 np0005546954 python3.9[110471]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  5 07:27:58 np0005546954 python3.9[110629]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  5 07:27:59 np0005546954 python3.9[110789]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:28:00 np0005546954 python3.9[110873]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:28:15 np0005546954 podman[111057]: 2025-12-05 12:28:15.562940859 +0000 UTC m=+0.059714981 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  5 07:28:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:28:16.915 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:28:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:28:16.916 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:28:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:28:16.916 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:28:21 np0005546954 podman[111077]: 2025-12-05 12:28:21.588397949 +0000 UTC m=+0.093855594 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec  5 07:28:34 np0005546954 kernel: SELinux:  Converting 2758 SID table entries...
Dec  5 07:28:34 np0005546954 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 07:28:34 np0005546954 kernel: SELinux:  policy capability open_perms=1
Dec  5 07:28:34 np0005546954 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 07:28:34 np0005546954 kernel: SELinux:  policy capability always_check_network=0
Dec  5 07:28:34 np0005546954 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 07:28:34 np0005546954 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 07:28:34 np0005546954 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 07:28:44 np0005546954 kernel: SELinux:  Converting 2758 SID table entries...
Dec  5 07:28:44 np0005546954 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 07:28:44 np0005546954 kernel: SELinux:  policy capability open_perms=1
Dec  5 07:28:44 np0005546954 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 07:28:44 np0005546954 kernel: SELinux:  policy capability always_check_network=0
Dec  5 07:28:44 np0005546954 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 07:28:44 np0005546954 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 07:28:44 np0005546954 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 07:28:46 np0005546954 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec  5 07:28:46 np0005546954 podman[111123]: 2025-12-05 12:28:46.5884146 +0000 UTC m=+0.083536353 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 07:28:52 np0005546954 podman[111143]: 2025-12-05 12:28:52.596158261 +0000 UTC m=+0.105895763 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:29:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:29:16.917 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:29:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:29:16.921 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:29:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:29:16.921 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:29:17 np0005546954 podman[124284]: 2025-12-05 12:29:17.592247524 +0000 UTC m=+0.090571314 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:29:23 np0005546954 podman[127976]: 2025-12-05 12:29:23.564059225 +0000 UTC m=+0.078059402 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec  5 07:29:36 np0005546954 kernel: SELinux:  Converting 2759 SID table entries...
Dec  5 07:29:36 np0005546954 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 07:29:36 np0005546954 kernel: SELinux:  policy capability open_perms=1
Dec  5 07:29:36 np0005546954 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 07:29:36 np0005546954 kernel: SELinux:  policy capability always_check_network=0
Dec  5 07:29:36 np0005546954 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 07:29:36 np0005546954 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 07:29:36 np0005546954 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 07:29:37 np0005546954 dbus-broker-launch[738]: Noticed file-system modification, trigger reload.
Dec  5 07:29:37 np0005546954 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec  5 07:29:37 np0005546954 dbus-broker-launch[738]: Noticed file-system modification, trigger reload.
Dec  5 07:29:45 np0005546954 systemd[1]: Stopping OpenSSH server daemon...
Dec  5 07:29:45 np0005546954 systemd[1]: sshd.service: Deactivated successfully.
Dec  5 07:29:45 np0005546954 systemd[1]: Stopped OpenSSH server daemon.
Dec  5 07:29:45 np0005546954 systemd[1]: sshd.service: Consumed 1.304s CPU time, read 32.0K from disk, written 0B to disk.
Dec  5 07:29:45 np0005546954 systemd[1]: Stopped target sshd-keygen.target.
Dec  5 07:29:45 np0005546954 systemd[1]: Stopping sshd-keygen.target...
Dec  5 07:29:45 np0005546954 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  5 07:29:45 np0005546954 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  5 07:29:45 np0005546954 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  5 07:29:45 np0005546954 systemd[1]: Reached target sshd-keygen.target.
Dec  5 07:29:45 np0005546954 systemd[1]: Starting OpenSSH server daemon...
Dec  5 07:29:45 np0005546954 systemd[1]: Started OpenSSH server daemon.
Dec  5 07:29:47 np0005546954 podman[128982]: 2025-12-05 12:29:47.76533264 +0000 UTC m=+0.115176750 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec  5 07:29:47 np0005546954 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 07:29:47 np0005546954 systemd[1]: Starting man-db-cache-update.service...
Dec  5 07:29:47 np0005546954 systemd[1]: Reloading.
Dec  5 07:29:47 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:29:47 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:29:48 np0005546954 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 07:29:52 np0005546954 python3.9[134016]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 07:29:52 np0005546954 systemd[1]: Reloading.
Dec  5 07:29:52 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:29:52 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:29:53 np0005546954 python3.9[135386]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 07:29:53 np0005546954 systemd[1]: Reloading.
Dec  5 07:29:53 np0005546954 podman[135728]: 2025-12-05 12:29:53.883876029 +0000 UTC m=+0.109273601 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Dec  5 07:29:53 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:29:53 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:29:54 np0005546954 python3.9[136833]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 07:29:54 np0005546954 systemd[1]: Reloading.
Dec  5 07:29:54 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:29:54 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:29:55 np0005546954 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 07:29:55 np0005546954 systemd[1]: Finished man-db-cache-update.service.
Dec  5 07:29:55 np0005546954 systemd[1]: man-db-cache-update.service: Consumed 10.049s CPU time.
Dec  5 07:29:55 np0005546954 systemd[1]: run-r69a1dadd6dc94bdabcc34dc70f897e33.service: Deactivated successfully.
Dec  5 07:29:55 np0005546954 python3.9[138184]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 07:29:55 np0005546954 systemd[1]: Reloading.
Dec  5 07:29:56 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:29:56 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:29:57 np0005546954 python3.9[138379]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:29:57 np0005546954 systemd[1]: Reloading.
Dec  5 07:29:57 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:29:57 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:29:58 np0005546954 python3.9[138569]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:29:58 np0005546954 systemd[1]: Reloading.
Dec  5 07:29:58 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:29:58 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:29:59 np0005546954 python3.9[138760]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:29:59 np0005546954 systemd[1]: Reloading.
Dec  5 07:29:59 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:29:59 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:30:00 np0005546954 python3.9[138949]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:01 np0005546954 python3.9[139104]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:01 np0005546954 systemd[1]: Reloading.
Dec  5 07:30:01 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:30:01 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:30:02 np0005546954 python3.9[139294]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 07:30:03 np0005546954 systemd[1]: Reloading.
Dec  5 07:30:03 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:30:03 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:30:03 np0005546954 systemd[1]: Listening on libvirt proxy daemon socket.
Dec  5 07:30:03 np0005546954 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec  5 07:30:04 np0005546954 python3.9[139487]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:05 np0005546954 python3.9[139642]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:06 np0005546954 python3.9[139797]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:06 np0005546954 python3.9[139952]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:07 np0005546954 python3.9[140107]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:08 np0005546954 python3.9[140262]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:09 np0005546954 python3.9[140417]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:10 np0005546954 python3.9[140572]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:11 np0005546954 python3.9[140727]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:11 np0005546954 python3.9[140882]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:12 np0005546954 python3.9[141037]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:13 np0005546954 python3.9[141192]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:14 np0005546954 python3.9[141347]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:16 np0005546954 python3.9[141502]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 07:30:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:30:16.919 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:30:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:30:16.924 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:30:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:30:16.924 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:30:17 np0005546954 podman[141629]: 2025-12-05 12:30:17.875186394 +0000 UTC m=+0.064524001 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  5 07:30:18 np0005546954 python3.9[141670]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:30:18 np0005546954 python3.9[141825]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:30:19 np0005546954 python3.9[141977]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:30:19 np0005546954 python3.9[142129]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:30:20 np0005546954 python3.9[142281]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:30:21 np0005546954 python3.9[142433]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:30:22 np0005546954 python3.9[142585]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:22 np0005546954 python3.9[142710]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764937821.4353063-1089-274187989554953/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:23 np0005546954 python3.9[142862]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:23 np0005546954 python3.9[142987]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764937822.931483-1089-56118571601122/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:24 np0005546954 podman[143111]: 2025-12-05 12:30:24.493071624 +0000 UTC m=+0.098251486 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec  5 07:30:24 np0005546954 python3.9[143156]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:25 np0005546954 python3.9[143289]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764937824.1396856-1089-249837734780635/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:25 np0005546954 python3.9[143441]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:26 np0005546954 python3.9[143566]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764937825.5050137-1089-169910307520217/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:27 np0005546954 python3.9[143718]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:27 np0005546954 python3.9[143843]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764937826.7344496-1089-199564419605233/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:28 np0005546954 python3.9[143995]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:29 np0005546954 python3.9[144120]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764937828.0469792-1089-280425200943672/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:30 np0005546954 python3.9[144272]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:30 np0005546954 python3.9[144395]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764937829.5151958-1089-97193571927090/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:31 np0005546954 python3.9[144547]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:31 np0005546954 python3.9[144672]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764937830.7966855-1089-249652375658731/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:32 np0005546954 python3.9[144824]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec  5 07:30:33 np0005546954 python3.9[144977]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:34 np0005546954 python3.9[145129]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:35 np0005546954 python3.9[145281]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:35 np0005546954 python3.9[145433]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:36 np0005546954 python3.9[145585]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:37 np0005546954 python3.9[145737]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:37 np0005546954 python3.9[145889]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:38 np0005546954 python3.9[146041]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:39 np0005546954 python3.9[146193]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:39 np0005546954 python3.9[146345]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:40 np0005546954 python3.9[146497]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:41 np0005546954 python3.9[146649]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:42 np0005546954 python3.9[146801]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:42 np0005546954 python3.9[146953]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:43 np0005546954 python3.9[147105]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:44 np0005546954 python3.9[147228]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937842.9819424-1531-205880526027168/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:44 np0005546954 python3.9[147380]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:45 np0005546954 python3.9[147503]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937844.3996077-1531-168633006300905/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:46 np0005546954 python3.9[147655]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:46 np0005546954 python3.9[147778]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937845.7546918-1531-62674023325175/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:47 np0005546954 python3.9[147930]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:48 np0005546954 podman[148053]: 2025-12-05 12:30:48.036491037 +0000 UTC m=+0.059525150 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  5 07:30:48 np0005546954 python3.9[148054]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937847.028425-1531-57773731659560/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:48 np0005546954 python3.9[148224]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:49 np0005546954 python3.9[148347]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937848.473639-1531-50880533562879/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:50 np0005546954 python3.9[148499]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:50 np0005546954 python3.9[148622]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937849.6667397-1531-233033849651050/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:51 np0005546954 python3.9[148774]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:51 np0005546954 python3.9[148897]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937850.919591-1531-115956004327194/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:52 np0005546954 python3.9[149049]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:53 np0005546954 python3.9[149172]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937852.063051-1531-187093276641667/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:53 np0005546954 python3.9[149324]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:54 np0005546954 python3.9[149447]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937853.432812-1531-62111107378393/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:55 np0005546954 podman[149571]: 2025-12-05 12:30:55.080484303 +0000 UTC m=+0.104726744 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:30:55 np0005546954 python3.9[149616]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:55 np0005546954 python3.9[149746]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937854.686832-1531-119157643056688/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:56 np0005546954 python3.9[149898]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:57 np0005546954 python3.9[150021]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937855.972547-1531-22098770037516/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:57 np0005546954 python3.9[150173]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:58 np0005546954 python3.9[150296]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937857.4120548-1531-258561122828234/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:30:59 np0005546954 python3.9[150448]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:30:59 np0005546954 python3.9[150571]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937858.68664-1531-186754758089886/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:00 np0005546954 python3.9[150723]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:01 np0005546954 python3.9[150846]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937859.935516-1531-269796169792331/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:01 np0005546954 python3.9[150996]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:31:02 np0005546954 python3.9[151151]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec  5 07:31:05 np0005546954 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec  5 07:31:05 np0005546954 python3.9[151307]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:06 np0005546954 python3.9[151459]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:06 np0005546954 python3.9[151611]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:07 np0005546954 python3.9[151763]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:08 np0005546954 python3.9[151915]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:08 np0005546954 python3.9[152067]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:09 np0005546954 python3.9[152219]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:10 np0005546954 python3.9[152371]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:10 np0005546954 python3.9[152523]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:11 np0005546954 python3.9[152675]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:12 np0005546954 python3.9[152827]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:31:12 np0005546954 systemd[1]: Reloading.
Dec  5 07:31:12 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:31:12 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:31:12 np0005546954 systemd[1]: Starting libvirt logging daemon socket...
Dec  5 07:31:12 np0005546954 systemd[1]: Listening on libvirt logging daemon socket.
Dec  5 07:31:12 np0005546954 systemd[1]: Starting libvirt logging daemon admin socket...
Dec  5 07:31:12 np0005546954 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec  5 07:31:12 np0005546954 systemd[1]: Starting libvirt logging daemon...
Dec  5 07:31:12 np0005546954 systemd[1]: Started libvirt logging daemon.
Dec  5 07:31:13 np0005546954 python3.9[153020]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:31:13 np0005546954 systemd[1]: Reloading.
Dec  5 07:31:13 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:31:13 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:31:14 np0005546954 systemd[1]: Starting libvirt nodedev daemon socket...
Dec  5 07:31:14 np0005546954 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec  5 07:31:14 np0005546954 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec  5 07:31:14 np0005546954 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec  5 07:31:14 np0005546954 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec  5 07:31:14 np0005546954 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec  5 07:31:14 np0005546954 systemd[1]: Starting libvirt nodedev daemon...
Dec  5 07:31:14 np0005546954 systemd[1]: Started libvirt nodedev daemon.
Dec  5 07:31:14 np0005546954 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec  5 07:31:14 np0005546954 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec  5 07:31:14 np0005546954 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec  5 07:31:14 np0005546954 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec  5 07:31:14 np0005546954 python3.9[153237]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:31:14 np0005546954 systemd[1]: Reloading.
Dec  5 07:31:14 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:31:14 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:31:15 np0005546954 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec  5 07:31:15 np0005546954 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec  5 07:31:15 np0005546954 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec  5 07:31:15 np0005546954 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec  5 07:31:15 np0005546954 systemd[1]: Starting libvirt proxy daemon...
Dec  5 07:31:15 np0005546954 systemd[1]: Started libvirt proxy daemon.
Dec  5 07:31:15 np0005546954 setroubleshoot[153081]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 55c77deb-33db-4705-904d-12a0a1bb9f42
Dec  5 07:31:15 np0005546954 setroubleshoot[153081]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  5 07:31:15 np0005546954 setroubleshoot[153081]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 55c77deb-33db-4705-904d-12a0a1bb9f42
Dec  5 07:31:15 np0005546954 setroubleshoot[153081]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  5 07:31:15 np0005546954 python3.9[153457]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:31:15 np0005546954 systemd[1]: Reloading.
Dec  5 07:31:16 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:31:16 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:31:16 np0005546954 systemd[1]: Listening on libvirt locking daemon socket.
Dec  5 07:31:16 np0005546954 systemd[1]: Starting libvirt QEMU daemon socket...
Dec  5 07:31:16 np0005546954 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec  5 07:31:16 np0005546954 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec  5 07:31:16 np0005546954 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec  5 07:31:16 np0005546954 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec  5 07:31:16 np0005546954 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec  5 07:31:16 np0005546954 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec  5 07:31:16 np0005546954 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec  5 07:31:16 np0005546954 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec  5 07:31:16 np0005546954 systemd[1]: Starting libvirt QEMU daemon...
Dec  5 07:31:16 np0005546954 systemd[1]: Started libvirt QEMU daemon.
Dec  5 07:31:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:31:16.921 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:31:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:31:16.925 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:31:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:31:16.925 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:31:17 np0005546954 python3.9[153672]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:31:17 np0005546954 systemd[1]: Reloading.
Dec  5 07:31:17 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:31:17 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:31:17 np0005546954 systemd[1]: Starting libvirt secret daemon socket...
Dec  5 07:31:17 np0005546954 systemd[1]: Listening on libvirt secret daemon socket.
Dec  5 07:31:17 np0005546954 systemd[1]: Starting libvirt secret daemon admin socket...
Dec  5 07:31:17 np0005546954 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec  5 07:31:17 np0005546954 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec  5 07:31:17 np0005546954 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec  5 07:31:17 np0005546954 systemd[1]: Starting libvirt secret daemon...
Dec  5 07:31:17 np0005546954 systemd[1]: Started libvirt secret daemon.
Dec  5 07:31:18 np0005546954 podman[153856]: 2025-12-05 12:31:18.255165481 +0000 UTC m=+0.078109405 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 07:31:18 np0005546954 python3.9[153903]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:19 np0005546954 python3.9[154055]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  5 07:31:20 np0005546954 python3.9[154207]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:20 np0005546954 python3.9[154330]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937879.6051235-2221-1126121084525/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:21 np0005546954 python3.9[154482]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:22 np0005546954 python3.9[154634]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:22 np0005546954 python3.9[154712]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:23 np0005546954 python3.9[154865]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:23 np0005546954 python3.9[154943]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.k4bxkk2z recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:24 np0005546954 python3.9[155095]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:25 np0005546954 python3.9[155173]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:25 np0005546954 podman[155255]: 2025-12-05 12:31:25.582983131 +0000 UTC m=+0.086497530 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  5 07:31:25 np0005546954 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec  5 07:31:25 np0005546954 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec  5 07:31:25 np0005546954 python3.9[155351]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:31:26 np0005546954 python3[155504]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  5 07:31:27 np0005546954 python3.9[155656]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:27 np0005546954 python3.9[155734]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:28 np0005546954 python3.9[155886]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:29 np0005546954 python3.9[155964]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:29 np0005546954 python3.9[156116]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:30 np0005546954 python3.9[156194]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:30 np0005546954 python3.9[156346]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:31 np0005546954 python3.9[156424]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:32 np0005546954 python3.9[156576]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:32 np0005546954 python3.9[156701]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764937891.4907799-2471-21346336590776/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:33 np0005546954 python3.9[156853]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:33 np0005546954 python3.9[157005]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:31:34 np0005546954 python3.9[157160]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:35 np0005546954 python3.9[157312]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:31:36 np0005546954 python3.9[157465]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:31:37 np0005546954 python3.9[157619]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:31:37 np0005546954 python3.9[157774]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:38 np0005546954 python3.9[157926]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:39 np0005546954 python3.9[158049]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937898.2014484-2615-46638302143359/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:40 np0005546954 python3.9[158201]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:40 np0005546954 python3.9[158324]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937899.5679975-2647-230454178377744/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:41 np0005546954 python3.9[158476]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:31:41 np0005546954 python3.9[158599]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937900.8050995-2675-80975836230323/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:31:42 np0005546954 python3.9[158751]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:31:42 np0005546954 systemd[1]: Reloading.
Dec  5 07:31:42 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:31:42 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:31:42 np0005546954 systemd[1]: Reached target edpm_libvirt.target.
Dec  5 07:31:43 np0005546954 python3.9[158941]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  5 07:31:43 np0005546954 systemd[1]: Reloading.
Dec  5 07:31:43 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:31:43 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:31:44 np0005546954 systemd[1]: Reloading.
Dec  5 07:31:44 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:31:44 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:31:44 np0005546954 systemd[1]: session-24.scope: Deactivated successfully.
Dec  5 07:31:44 np0005546954 systemd[1]: session-24.scope: Consumed 3min 24.177s CPU time.
Dec  5 07:31:44 np0005546954 systemd-logind[789]: Session 24 logged out. Waiting for processes to exit.
Dec  5 07:31:44 np0005546954 systemd-logind[789]: Removed session 24.
Dec  5 07:31:48 np0005546954 podman[159037]: 2025-12-05 12:31:48.574933592 +0000 UTC m=+0.075287027 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec  5 07:31:50 np0005546954 systemd-logind[789]: New session 25 of user zuul.
Dec  5 07:31:50 np0005546954 systemd[1]: Started Session 25 of User zuul.
Dec  5 07:31:51 np0005546954 python3.9[159209]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:31:52 np0005546954 python3.9[159363]: ansible-ansible.builtin.service_facts Invoked
Dec  5 07:31:52 np0005546954 network[159380]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 07:31:52 np0005546954 network[159381]: 'network-scripts' will be removed from distribution in near future.
Dec  5 07:31:52 np0005546954 network[159382]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 07:31:55 np0005546954 podman[159494]: 2025-12-05 12:31:55.741726822 +0000 UTC m=+0.102309638 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:31:56 np0005546954 python3.9[159680]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 07:31:57 np0005546954 python3.9[159764]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:32:04 np0005546954 python3.9[159917]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:32:05 np0005546954 python3.9[160069]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:32:05 np0005546954 python3.9[160222]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:32:06 np0005546954 python3.9[160374]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:32:07 np0005546954 python3.9[160527]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:32:07 np0005546954 python3.9[160650]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937926.762846-171-129584190304098/.source.iscsi _original_basename=.as1h79zf follow=False checksum=81df0f7258b2c730a646d5d4325f18caad7b077f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:08 np0005546954 python3.9[160802]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:09 np0005546954 python3.9[160954]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:09 np0005546954 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:32:09 np0005546954 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:32:10 np0005546954 python3.9[161107]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:32:11 np0005546954 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec  5 07:32:11 np0005546954 python3.9[161263]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:32:11 np0005546954 systemd[1]: Reloading.
Dec  5 07:32:12 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:32:12 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:32:12 np0005546954 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  5 07:32:12 np0005546954 systemd[1]: Starting Open-iSCSI...
Dec  5 07:32:12 np0005546954 kernel: Loading iSCSI transport class v2.0-870.
Dec  5 07:32:12 np0005546954 systemd[1]: Started Open-iSCSI.
Dec  5 07:32:12 np0005546954 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec  5 07:32:12 np0005546954 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec  5 07:32:13 np0005546954 python3.9[161464]: ansible-ansible.builtin.service_facts Invoked
Dec  5 07:32:13 np0005546954 network[161481]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 07:32:13 np0005546954 network[161482]: 'network-scripts' will be removed from distribution in near future.
Dec  5 07:32:13 np0005546954 network[161483]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 07:32:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:32:16.922 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:32:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:32:16.924 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:32:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:32:16.924 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:32:18 np0005546954 python3.9[161754]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  5 07:32:19 np0005546954 podman[161878]: 2025-12-05 12:32:19.3660682 +0000 UTC m=+0.064012924 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  5 07:32:19 np0005546954 python3.9[161921]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec  5 07:32:20 np0005546954 python3.9[162081]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:32:20 np0005546954 python3.9[162204]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937939.8363423-325-29367377085345/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:21 np0005546954 python3.9[162356]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:22 np0005546954 python3.9[162508]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:32:22 np0005546954 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  5 07:32:22 np0005546954 systemd[1]: Stopped Load Kernel Modules.
Dec  5 07:32:22 np0005546954 systemd[1]: Stopping Load Kernel Modules...
Dec  5 07:32:22 np0005546954 systemd[1]: Starting Load Kernel Modules...
Dec  5 07:32:22 np0005546954 systemd[1]: Finished Load Kernel Modules.
Dec  5 07:32:23 np0005546954 python3.9[162664]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:32:24 np0005546954 python3.9[162816]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:32:25 np0005546954 python3.9[162968]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:32:25 np0005546954 podman[163120]: 2025-12-05 12:32:25.919580661 +0000 UTC m=+0.100046357 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec  5 07:32:26 np0005546954 python3.9[163121]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:32:26 np0005546954 python3.9[163267]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937945.473148-441-160800814048809/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:27 np0005546954 python3.9[163419]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:32:28 np0005546954 python3.9[163572]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:28 np0005546954 python3.9[163724]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:29 np0005546954 python3.9[163876]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:30 np0005546954 python3.9[164028]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:31 np0005546954 python3.9[164180]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:31 np0005546954 python3.9[164332]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:32 np0005546954 python3.9[164484]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:33 np0005546954 python3.9[164636]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:32:34 np0005546954 python3.9[164790]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:34 np0005546954 python3.9[164942]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:32:35 np0005546954 python3.9[165094]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:32:36 np0005546954 python3.9[165172]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:32:36 np0005546954 python3.9[165324]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:32:37 np0005546954 python3.9[165402]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:32:37 np0005546954 python3.9[165554]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:38 np0005546954 python3.9[165706]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:32:39 np0005546954 python3.9[165784]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:39 np0005546954 python3.9[165936]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:32:40 np0005546954 python3.9[166014]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:41 np0005546954 python3.9[166166]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:32:41 np0005546954 systemd[1]: Reloading.
Dec  5 07:32:41 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:32:41 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:32:42 np0005546954 python3.9[166355]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:32:43 np0005546954 python3.9[166433]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:43 np0005546954 python3.9[166585]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:32:44 np0005546954 python3.9[166663]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:45 np0005546954 python3.9[166815]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:32:45 np0005546954 systemd[1]: Reloading.
Dec  5 07:32:45 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:32:45 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:32:45 np0005546954 systemd[1]: Starting Create netns directory...
Dec  5 07:32:45 np0005546954 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  5 07:32:45 np0005546954 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  5 07:32:45 np0005546954 systemd[1]: Finished Create netns directory.
Dec  5 07:32:46 np0005546954 python3.9[167008]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:32:47 np0005546954 python3.9[167160]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:32:47 np0005546954 python3.9[167283]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764937966.6788614-855-171226539540317/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:32:48 np0005546954 python3.9[167435]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:32:49 np0005546954 podman[167559]: 2025-12-05 12:32:49.571647121 +0000 UTC m=+0.102768817 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:32:49 np0005546954 python3.9[167598]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:32:50 np0005546954 python3.9[167729]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937969.1602902-905-141629924695416/.source.json _original_basename=.d3xabhv4 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:51 np0005546954 python3.9[167881]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:32:53 np0005546954 python3.9[168308]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec  5 07:32:54 np0005546954 python3.9[168460]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 07:32:55 np0005546954 python3.9[168612]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  5 07:32:56 np0005546954 podman[168663]: 2025-12-05 12:32:56.618831449 +0000 UTC m=+0.125805259 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:32:57 np0005546954 python3[168816]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 07:32:57 np0005546954 podman[168850]: 2025-12-05 12:32:57.744368907 +0000 UTC m=+0.080480518 container create bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec  5 07:32:57 np0005546954 podman[168850]: 2025-12-05 12:32:57.704551453 +0000 UTC m=+0.040663114 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  5 07:32:57 np0005546954 python3[168816]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  5 07:32:58 np0005546954 python3.9[169041]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:32:59 np0005546954 python3.9[169195]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:00 np0005546954 python3.9[169271]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:33:00 np0005546954 python3.9[169422]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764937980.1714668-1081-274497241492975/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:01 np0005546954 python3.9[169498]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 07:33:01 np0005546954 systemd[1]: Reloading.
Dec  5 07:33:01 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:33:01 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:33:02 np0005546954 python3.9[169609]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:33:02 np0005546954 systemd[1]: Reloading.
Dec  5 07:33:02 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:33:02 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:33:02 np0005546954 systemd[1]: Starting multipathd container...
Dec  5 07:33:03 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:33:03 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f361f72789831de3179114bc14e36b8f1d544d28cc58b08682a6900a06d118c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  5 07:33:03 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f361f72789831de3179114bc14e36b8f1d544d28cc58b08682a6900a06d118c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  5 07:33:03 np0005546954 systemd[1]: Started /usr/bin/podman healthcheck run bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d.
Dec  5 07:33:03 np0005546954 podman[169649]: 2025-12-05 12:33:03.13740886 +0000 UTC m=+0.170657365 container init bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  5 07:33:03 np0005546954 multipathd[169663]: + sudo -E kolla_set_configs
Dec  5 07:33:03 np0005546954 podman[169649]: 2025-12-05 12:33:03.163490959 +0000 UTC m=+0.196739434 container start bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:33:03 np0005546954 podman[169649]: multipathd
Dec  5 07:33:03 np0005546954 systemd[1]: Started multipathd container.
Dec  5 07:33:03 np0005546954 podman[169669]: 2025-12-05 12:33:03.243372567 +0000 UTC m=+0.066814684 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec  5 07:33:03 np0005546954 multipathd[169663]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 07:33:03 np0005546954 multipathd[169663]: INFO:__main__:Validating config file
Dec  5 07:33:03 np0005546954 multipathd[169663]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 07:33:03 np0005546954 multipathd[169663]: INFO:__main__:Writing out command to execute
Dec  5 07:33:03 np0005546954 systemd[1]: bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d-5f59741fec86e467.service: Main process exited, code=exited, status=1/FAILURE
Dec  5 07:33:03 np0005546954 systemd[1]: bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d-5f59741fec86e467.service: Failed with result 'exit-code'.
Dec  5 07:33:03 np0005546954 multipathd[169663]: ++ cat /run_command
Dec  5 07:33:03 np0005546954 multipathd[169663]: + CMD='/usr/sbin/multipathd -d'
Dec  5 07:33:03 np0005546954 multipathd[169663]: + ARGS=
Dec  5 07:33:03 np0005546954 multipathd[169663]: + sudo kolla_copy_cacerts
Dec  5 07:33:03 np0005546954 multipathd[169663]: + [[ ! -n '' ]]
Dec  5 07:33:03 np0005546954 multipathd[169663]: + . kolla_extend_start
Dec  5 07:33:03 np0005546954 multipathd[169663]: Running command: '/usr/sbin/multipathd -d'
Dec  5 07:33:03 np0005546954 multipathd[169663]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  5 07:33:03 np0005546954 multipathd[169663]: + umask 0022
Dec  5 07:33:03 np0005546954 multipathd[169663]: + exec /usr/sbin/multipathd -d
Dec  5 07:33:03 np0005546954 multipathd[169663]: 2989.999117 | --------start up--------
Dec  5 07:33:03 np0005546954 multipathd[169663]: 2989.999143 | read /etc/multipath.conf
Dec  5 07:33:03 np0005546954 multipathd[169663]: 2990.009952 | path checkers start up
Dec  5 07:33:03 np0005546954 python3.9[169851]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:33:04 np0005546954 python3.9[170005]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:33:05 np0005546954 python3.9[170170]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:33:05 np0005546954 systemd[1]: Stopping multipathd container...
Dec  5 07:33:05 np0005546954 multipathd[169663]: 2992.468328 | exit (signal)
Dec  5 07:33:05 np0005546954 multipathd[169663]: 2992.468389 | --------shut down-------
Dec  5 07:33:05 np0005546954 systemd[1]: libpod-bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d.scope: Deactivated successfully.
Dec  5 07:33:05 np0005546954 podman[170174]: 2025-12-05 12:33:05.81437609 +0000 UTC m=+0.082614827 container died bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec  5 07:33:05 np0005546954 systemd[1]: bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d-5f59741fec86e467.timer: Deactivated successfully.
Dec  5 07:33:05 np0005546954 systemd[1]: Stopped /usr/bin/podman healthcheck run bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d.
Dec  5 07:33:05 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d-userdata-shm.mount: Deactivated successfully.
Dec  5 07:33:05 np0005546954 systemd[1]: var-lib-containers-storage-overlay-3f361f72789831de3179114bc14e36b8f1d544d28cc58b08682a6900a06d118c-merged.mount: Deactivated successfully.
Dec  5 07:33:05 np0005546954 podman[170174]: 2025-12-05 12:33:05.860517646 +0000 UTC m=+0.128756333 container cleanup bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec  5 07:33:05 np0005546954 podman[170174]: multipathd
Dec  5 07:33:05 np0005546954 podman[170203]: multipathd
Dec  5 07:33:05 np0005546954 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec  5 07:33:05 np0005546954 systemd[1]: Stopped multipathd container.
Dec  5 07:33:05 np0005546954 systemd[1]: Starting multipathd container...
Dec  5 07:33:06 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:33:06 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f361f72789831de3179114bc14e36b8f1d544d28cc58b08682a6900a06d118c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  5 07:33:06 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f361f72789831de3179114bc14e36b8f1d544d28cc58b08682a6900a06d118c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  5 07:33:06 np0005546954 systemd[1]: Started /usr/bin/podman healthcheck run bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d.
Dec  5 07:33:06 np0005546954 podman[170216]: 2025-12-05 12:33:06.132449767 +0000 UTC m=+0.130739845 container init bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 07:33:06 np0005546954 multipathd[170231]: + sudo -E kolla_set_configs
Dec  5 07:33:06 np0005546954 podman[170216]: 2025-12-05 12:33:06.156255104 +0000 UTC m=+0.154545152 container start bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:33:06 np0005546954 podman[170216]: multipathd
Dec  5 07:33:06 np0005546954 systemd[1]: Started multipathd container.
Dec  5 07:33:06 np0005546954 multipathd[170231]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 07:33:06 np0005546954 multipathd[170231]: INFO:__main__:Validating config file
Dec  5 07:33:06 np0005546954 multipathd[170231]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 07:33:06 np0005546954 multipathd[170231]: INFO:__main__:Writing out command to execute
Dec  5 07:33:06 np0005546954 podman[170238]: 2025-12-05 12:33:06.236545475 +0000 UTC m=+0.066990329 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:33:06 np0005546954 multipathd[170231]: ++ cat /run_command
Dec  5 07:33:06 np0005546954 systemd[1]: bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d-644e6c375fa537d1.service: Main process exited, code=exited, status=1/FAILURE
Dec  5 07:33:06 np0005546954 systemd[1]: bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d-644e6c375fa537d1.service: Failed with result 'exit-code'.
Dec  5 07:33:06 np0005546954 multipathd[170231]: + CMD='/usr/sbin/multipathd -d'
Dec  5 07:33:06 np0005546954 multipathd[170231]: + ARGS=
Dec  5 07:33:06 np0005546954 multipathd[170231]: + sudo kolla_copy_cacerts
Dec  5 07:33:06 np0005546954 multipathd[170231]: + [[ ! -n '' ]]
Dec  5 07:33:06 np0005546954 multipathd[170231]: + . kolla_extend_start
Dec  5 07:33:06 np0005546954 multipathd[170231]: Running command: '/usr/sbin/multipathd -d'
Dec  5 07:33:06 np0005546954 multipathd[170231]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  5 07:33:06 np0005546954 multipathd[170231]: + umask 0022
Dec  5 07:33:06 np0005546954 multipathd[170231]: + exec /usr/sbin/multipathd -d
Dec  5 07:33:06 np0005546954 multipathd[170231]: 2992.972629 | --------start up--------
Dec  5 07:33:06 np0005546954 multipathd[170231]: 2992.972651 | read /etc/multipath.conf
Dec  5 07:33:06 np0005546954 multipathd[170231]: 2992.980454 | path checkers start up
Dec  5 07:33:07 np0005546954 python3.9[170422]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:08 np0005546954 python3.9[170574]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  5 07:33:09 np0005546954 python3.9[170726]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec  5 07:33:09 np0005546954 kernel: Key type psk registered
Dec  5 07:33:10 np0005546954 python3.9[170887]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:33:10 np0005546954 python3.9[171010]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764937989.703486-1241-208471595157084/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:12 np0005546954 python3.9[171162]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:13 np0005546954 python3.9[171314]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:33:13 np0005546954 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  5 07:33:13 np0005546954 systemd[1]: Stopped Load Kernel Modules.
Dec  5 07:33:13 np0005546954 systemd[1]: Stopping Load Kernel Modules...
Dec  5 07:33:13 np0005546954 systemd[1]: Starting Load Kernel Modules...
Dec  5 07:33:13 np0005546954 systemd[1]: Finished Load Kernel Modules.
Dec  5 07:33:14 np0005546954 python3.9[171470]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 07:33:14 np0005546954 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec  5 07:33:15 np0005546954 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  5 07:33:16 np0005546954 systemd[1]: virtqemud.service: Deactivated successfully.
Dec  5 07:33:16 np0005546954 systemd[1]: Reloading.
Dec  5 07:33:16 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:33:16 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:33:16 np0005546954 systemd[1]: Reloading.
Dec  5 07:33:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:33:16.925 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:33:16 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:33:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:33:16.928 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:33:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:33:16.928 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:33:16 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:33:17 np0005546954 systemd-logind[789]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  5 07:33:17 np0005546954 systemd-logind[789]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  5 07:33:17 np0005546954 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 07:33:17 np0005546954 systemd[1]: Starting man-db-cache-update.service...
Dec  5 07:33:17 np0005546954 systemd[1]: Reloading.
Dec  5 07:33:17 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:33:17 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:33:17 np0005546954 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  5 07:33:17 np0005546954 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 07:33:19 np0005546954 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 07:33:19 np0005546954 systemd[1]: Finished man-db-cache-update.service.
Dec  5 07:33:19 np0005546954 systemd[1]: man-db-cache-update.service: Consumed 1.840s CPU time.
Dec  5 07:33:19 np0005546954 systemd[1]: run-r9327d054d31043f78b473bfbe2f5e1dd.service: Deactivated successfully.
Dec  5 07:33:19 np0005546954 python3.9[172874]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:33:19 np0005546954 systemd[1]: Stopping Open-iSCSI...
Dec  5 07:33:19 np0005546954 iscsid[161304]: iscsid shutting down.
Dec  5 07:33:19 np0005546954 systemd[1]: iscsid.service: Deactivated successfully.
Dec  5 07:33:19 np0005546954 systemd[1]: Stopped Open-iSCSI.
Dec  5 07:33:19 np0005546954 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  5 07:33:19 np0005546954 systemd[1]: Starting Open-iSCSI...
Dec  5 07:33:19 np0005546954 systemd[1]: Started Open-iSCSI.
Dec  5 07:33:19 np0005546954 podman[173054]: 2025-12-05 12:33:19.971212141 +0000 UTC m=+0.078730383 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:33:20 np0005546954 python3.9[173094]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:33:21 np0005546954 python3.9[173254]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:22 np0005546954 python3.9[173406]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 07:33:22 np0005546954 systemd[1]: Reloading.
Dec  5 07:33:22 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:33:22 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:33:23 np0005546954 python3.9[173590]: ansible-ansible.builtin.service_facts Invoked
Dec  5 07:33:23 np0005546954 network[173607]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 07:33:23 np0005546954 network[173608]: 'network-scripts' will be removed from distribution in near future.
Dec  5 07:33:23 np0005546954 network[173609]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 07:33:26 np0005546954 podman[173709]: 2025-12-05 12:33:26.843510471 +0000 UTC m=+0.154295914 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller)
Dec  5 07:33:28 np0005546954 python3.9[173911]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:33:29 np0005546954 python3.9[174064]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:33:29 np0005546954 python3.9[174217]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:33:30 np0005546954 python3.9[174370]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:33:31 np0005546954 python3.9[174523]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:33:32 np0005546954 python3.9[174676]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:33:33 np0005546954 python3.9[174829]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:33:34 np0005546954 python3.9[174982]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:33:35 np0005546954 python3.9[175135]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:35 np0005546954 python3.9[175287]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:36 np0005546954 python3.9[175439]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:36 np0005546954 podman[175485]: 2025-12-05 12:33:36.548035547 +0000 UTC m=+0.060580386 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:33:37 np0005546954 python3.9[175611]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:37 np0005546954 python3.9[175763]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:38 np0005546954 python3.9[175915]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:39 np0005546954 python3.9[176067]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:39 np0005546954 python3.9[176219]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:40 np0005546954 python3.9[176371]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:41 np0005546954 python3.9[176523]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:41 np0005546954 python3.9[176675]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:42 np0005546954 python3.9[176827]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:43 np0005546954 python3.9[176979]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:44 np0005546954 python3.9[177131]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:44 np0005546954 python3.9[177283]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:45 np0005546954 python3.9[177435]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:33:46 np0005546954 python3.9[177587]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:33:47 np0005546954 python3.9[177739]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  5 07:33:48 np0005546954 python3.9[177891]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 07:33:48 np0005546954 systemd[1]: Reloading.
Dec  5 07:33:48 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:33:48 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:33:49 np0005546954 python3.9[178079]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:33:50 np0005546954 python3.9[178232]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:33:50 np0005546954 podman[178234]: 2025-12-05 12:33:50.196681185 +0000 UTC m=+0.054889319 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec  5 07:33:50 np0005546954 python3.9[178404]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:33:51 np0005546954 python3.9[178557]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:33:52 np0005546954 python3.9[178710]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:33:52 np0005546954 python3.9[178863]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:33:53 np0005546954 python3.9[179016]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:33:54 np0005546954 python3.9[179169]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:33:55 np0005546954 python3.9[179322]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:33:56 np0005546954 python3.9[179474]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:33:57 np0005546954 podman[179598]: 2025-12-05 12:33:57.209202865 +0000 UTC m=+0.082178952 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Dec  5 07:33:57 np0005546954 python3.9[179644]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:33:58 np0005546954 python3.9[179805]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:33:58 np0005546954 python3.9[179957]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:33:59 np0005546954 python3.9[180109]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:00 np0005546954 python3.9[180261]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:00 np0005546954 python3.9[180413]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:01 np0005546954 python3.9[180565]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:02 np0005546954 python3.9[180717]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:07 np0005546954 podman[180841]: 2025-12-05 12:34:07.156012696 +0000 UTC m=+0.093451187 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  5 07:34:07 np0005546954 python3.9[180885]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec  5 07:34:08 np0005546954 python3.9[181041]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  5 07:34:09 np0005546954 python3.9[181199]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  5 07:34:10 np0005546954 systemd-logind[789]: New session 26 of user zuul.
Dec  5 07:34:10 np0005546954 systemd[1]: Started Session 26 of User zuul.
Dec  5 07:34:10 np0005546954 systemd[1]: session-26.scope: Deactivated successfully.
Dec  5 07:34:10 np0005546954 systemd-logind[789]: Session 26 logged out. Waiting for processes to exit.
Dec  5 07:34:10 np0005546954 systemd-logind[789]: Removed session 26.
Dec  5 07:34:11 np0005546954 python3.9[181385]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:34:11 np0005546954 python3.9[181506]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764938050.8309524-2322-23081447120973/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:12 np0005546954 python3.9[181656]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:34:13 np0005546954 python3.9[181732]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:14 np0005546954 python3.9[181882]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:34:14 np0005546954 python3.9[182003]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764938053.576444-2322-280382980653096/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:15 np0005546954 python3.9[182153]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:34:15 np0005546954 python3.9[182274]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764938054.8945966-2322-249734131779345/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:16 np0005546954 python3.9[182424]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:34:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:34:16.925 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:34:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:34:16.927 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:34:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:34:16.928 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:34:17 np0005546954 python3.9[182545]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764938056.113991-2322-109211469577447/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:17 np0005546954 python3.9[182695]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:34:18 np0005546954 python3.9[182816]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764938057.3063388-2322-250975478152621/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:19 np0005546954 python3.9[182968]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:34:20 np0005546954 python3.9[183120]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:34:20 np0005546954 podman[183226]: 2025-12-05 12:34:20.564994999 +0000 UTC m=+0.068148197 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 07:34:20 np0005546954 python3.9[183289]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:34:21 np0005546954 python3.9[183442]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:34:22 np0005546954 python3.9[183565]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764938060.976544-2536-18503426245680/.source _original_basename=.md821_f2 follow=False checksum=31df3550e616279e43a9fdfa981b68017c4ed860 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec  5 07:34:23 np0005546954 python3.9[183717]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:34:23 np0005546954 python3.9[183869]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:34:24 np0005546954 python3.9[183990]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764938063.2971492-2588-32212512737159/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:25 np0005546954 python3.9[184140]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:34:25 np0005546954 python3.9[184261]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764938064.648471-2619-230579896918296/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:34:26 np0005546954 python3.9[184413]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec  5 07:34:27 np0005546954 podman[184538]: 2025-12-05 12:34:27.566369339 +0000 UTC m=+0.106409926 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 07:34:27 np0005546954 python3.9[184590]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 07:34:28 np0005546954 python3[184743]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 07:34:29 np0005546954 podman[184781]: 2025-12-05 12:34:29.05252504 +0000 UTC m=+0.077356036 container create 51bc0995ce05197f39172c0011f26f6d0e2cad9af4bce8703b7a13fc12787ed3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.build-date=20251125, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec  5 07:34:29 np0005546954 podman[184781]: 2025-12-05 12:34:29.016520464 +0000 UTC m=+0.041351490 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  5 07:34:29 np0005546954 python3[184743]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec  5 07:34:29 np0005546954 python3.9[184971]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:34:31 np0005546954 python3.9[185125]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec  5 07:34:32 np0005546954 python3.9[185277]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 07:34:33 np0005546954 python3[185429]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 07:34:33 np0005546954 podman[185465]: 2025-12-05 12:34:33.381546581 +0000 UTC m=+0.028264316 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  5 07:34:33 np0005546954 podman[185465]: 2025-12-05 12:34:33.529220362 +0000 UTC m=+0.175938067 container create 315359a1f41c2807b4acaa531eb8dfd4d78461436dbf2825efa04aa7c738dbca (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 07:34:33 np0005546954 python3[185429]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec  5 07:34:34 np0005546954 python3.9[185653]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:34:35 np0005546954 python3.9[185807]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:34:35 np0005546954 python3.9[185958]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764938075.3876302-2802-228747122819310/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:34:36 np0005546954 python3.9[186034]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 07:34:36 np0005546954 systemd[1]: Reloading.
Dec  5 07:34:36 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:34:36 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:34:37 np0005546954 python3.9[186146]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:34:37 np0005546954 systemd[1]: Reloading.
Dec  5 07:34:37 np0005546954 podman[186148]: 2025-12-05 12:34:37.553265999 +0000 UTC m=+0.096668701 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Dec  5 07:34:37 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:34:37 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:34:37 np0005546954 systemd[1]: Starting nova_compute container...
Dec  5 07:34:37 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:34:37 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f31841d93424af2814b3b81009358d18034b65eda328d3a1037107d28056002/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:37 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f31841d93424af2814b3b81009358d18034b65eda328d3a1037107d28056002/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:37 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f31841d93424af2814b3b81009358d18034b65eda328d3a1037107d28056002/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:37 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f31841d93424af2814b3b81009358d18034b65eda328d3a1037107d28056002/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:37 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f31841d93424af2814b3b81009358d18034b65eda328d3a1037107d28056002/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:37 np0005546954 podman[186204]: 2025-12-05 12:34:37.986430084 +0000 UTC m=+0.126215628 container init 315359a1f41c2807b4acaa531eb8dfd4d78461436dbf2825efa04aa7c738dbca (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0)
Dec  5 07:34:37 np0005546954 podman[186204]: 2025-12-05 12:34:37.994262867 +0000 UTC m=+0.134048381 container start 315359a1f41c2807b4acaa531eb8dfd4d78461436dbf2825efa04aa7c738dbca (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:34:37 np0005546954 podman[186204]: nova_compute
Dec  5 07:34:38 np0005546954 nova_compute[186219]: + sudo -E kolla_set_configs
Dec  5 07:34:38 np0005546954 systemd[1]: Started nova_compute container.
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Validating config file
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Copying service configuration files
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Deleting /etc/ceph
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Creating directory /etc/ceph
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Setting permission for /etc/ceph
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Writing out command to execute
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  5 07:34:38 np0005546954 nova_compute[186219]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  5 07:34:38 np0005546954 nova_compute[186219]: ++ cat /run_command
Dec  5 07:34:38 np0005546954 nova_compute[186219]: + CMD=nova-compute
Dec  5 07:34:38 np0005546954 nova_compute[186219]: + ARGS=
Dec  5 07:34:38 np0005546954 nova_compute[186219]: + sudo kolla_copy_cacerts
Dec  5 07:34:38 np0005546954 nova_compute[186219]: + [[ ! -n '' ]]
Dec  5 07:34:38 np0005546954 nova_compute[186219]: + . kolla_extend_start
Dec  5 07:34:38 np0005546954 nova_compute[186219]: Running command: 'nova-compute'
Dec  5 07:34:38 np0005546954 nova_compute[186219]: + echo 'Running command: '\''nova-compute'\'''
Dec  5 07:34:38 np0005546954 nova_compute[186219]: + umask 0022
Dec  5 07:34:38 np0005546954 nova_compute[186219]: + exec nova-compute
Dec  5 07:34:39 np0005546954 python3.9[186381]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:34:40 np0005546954 nova_compute[186219]: 2025-12-05 12:34:40.205 186223 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  5 07:34:40 np0005546954 nova_compute[186219]: 2025-12-05 12:34:40.205 186223 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  5 07:34:40 np0005546954 nova_compute[186219]: 2025-12-05 12:34:40.205 186223 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  5 07:34:40 np0005546954 nova_compute[186219]: 2025-12-05 12:34:40.206 186223 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  5 07:34:40 np0005546954 python3.9[186531]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:34:40 np0005546954 nova_compute[186219]: 2025-12-05 12:34:40.332 186223 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:34:40 np0005546954 nova_compute[186219]: 2025-12-05 12:34:40.355 186223 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:34:40 np0005546954 nova_compute[186219]: 2025-12-05 12:34:40.355 186223 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  5 07:34:40 np0005546954 nova_compute[186219]: 2025-12-05 12:34:40.865 186223 INFO nova.virt.driver [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:40.999 186223 INFO nova.compute.provider_config [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.040 186223 DEBUG oslo_concurrency.lockutils [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.040 186223 DEBUG oslo_concurrency.lockutils [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.041 186223 DEBUG oslo_concurrency.lockutils [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.041 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.041 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.041 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.041 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.042 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.042 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.042 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.042 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.042 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.043 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.043 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.043 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.043 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.043 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.044 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.044 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.044 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.044 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.044 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.045 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.045 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.045 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.045 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.045 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.046 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.046 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.046 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.046 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.047 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.047 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.047 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.047 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.047 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.048 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.048 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.048 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.048 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.048 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.049 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.049 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.049 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.049 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.049 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.050 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.050 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.050 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.050 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.050 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.051 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.051 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.051 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.051 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.052 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.052 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.052 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.052 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.052 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.053 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.053 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.053 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.053 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.053 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.054 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.054 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.054 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.054 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.054 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.054 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.055 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.055 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.055 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.055 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.055 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.056 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.056 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.056 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.056 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.056 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.057 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.057 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.057 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.057 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.057 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.058 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.058 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.058 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.058 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.058 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.059 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.059 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.059 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.059 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.059 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.060 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.060 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.060 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.060 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.060 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.061 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.061 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.061 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.061 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.061 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.062 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.062 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.062 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.062 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.062 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.063 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.063 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.063 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.063 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.063 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.064 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.064 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.064 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.064 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.064 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.064 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.065 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.065 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.065 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.065 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.065 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.066 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.066 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.066 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.066 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.066 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.067 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.067 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.067 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.067 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.067 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.068 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.068 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.068 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.068 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.068 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.069 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.069 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.069 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.069 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.069 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.070 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.070 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.070 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.070 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.070 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.071 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.071 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.071 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.071 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.071 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.072 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.072 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.072 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.072 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.072 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.073 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.073 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.073 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.073 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.073 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.074 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.074 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.074 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.074 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.074 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.075 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.075 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.075 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.075 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.075 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.076 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.076 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.076 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.076 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.077 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.077 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.077 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.077 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.077 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.077 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.078 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.078 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.078 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.078 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.079 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.079 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.079 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.079 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.079 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.079 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.080 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.080 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.080 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.080 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.080 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.081 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.081 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.081 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.081 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.081 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.082 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.082 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.082 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.082 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.082 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.083 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.083 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.083 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.083 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.083 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.083 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.084 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.084 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.084 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.084 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.084 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.085 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.085 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.085 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.085 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.085 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.085 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.085 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.086 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.086 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.086 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.086 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.086 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.086 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.086 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.087 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.087 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.087 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.087 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.087 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.087 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.087 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.088 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.088 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.088 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.088 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.088 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.088 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.088 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.089 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.089 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.089 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.089 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.089 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.089 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.090 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.090 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.090 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.090 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.090 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.090 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.091 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.091 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.091 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.091 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.091 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.091 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.091 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.092 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.092 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.092 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.092 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.092 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.092 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.092 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.093 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.093 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.093 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.093 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.093 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.093 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.093 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.094 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.094 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.094 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.095 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.095 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.095 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.095 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.095 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.095 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.096 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.096 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.096 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.096 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.096 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.096 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.096 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.097 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.097 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.097 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.097 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.097 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.097 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.098 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.098 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.098 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.098 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.098 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.098 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.098 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.099 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.099 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.099 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.099 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.099 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.099 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.100 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 python3.9[186685]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.100 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.100 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.100 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.100 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.100 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.100 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.101 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.101 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.101 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.101 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.101 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.101 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.101 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.102 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.102 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.102 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.102 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.102 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.102 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.102 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.103 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.103 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.103 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.103 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.103 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.103 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.103 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.104 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.104 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.104 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.104 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.104 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.104 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.104 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.105 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.105 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.105 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.105 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.105 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.106 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.106 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.106 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.106 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.106 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.106 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.106 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.107 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.107 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.107 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.107 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.107 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.107 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.107 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.108 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.108 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.108 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.108 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.108 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.108 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.108 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.108 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.109 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.109 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.109 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.109 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.109 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.109 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.109 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.110 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.110 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.110 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.110 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.110 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.110 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.110 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.111 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.111 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.111 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.111 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.111 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.111 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.111 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.112 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.112 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.112 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.112 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.112 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.112 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.112 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.113 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.113 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.113 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.113 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.113 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.113 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.113 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.114 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.114 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.114 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.114 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.114 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.114 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.114 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.115 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.115 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.115 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.115 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.115 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.115 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.115 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.116 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.116 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.116 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.116 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.116 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.116 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.116 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.117 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.117 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.117 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.117 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.117 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.117 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.117 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.117 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.118 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.118 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.118 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.118 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.118 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.118 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.118 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.119 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.119 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.119 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.119 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.119 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.120 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.120 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.120 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.120 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.120 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.120 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.120 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.121 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.121 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.121 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.121 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.121 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.121 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.121 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.121 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.122 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.122 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.122 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.122 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.122 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.122 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.123 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.123 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.123 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.123 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.123 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.123 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.124 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.124 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.124 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.124 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.124 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.125 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.125 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.125 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.125 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.125 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.125 186223 WARNING oslo_config.cfg [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  5 07:34:41 np0005546954 nova_compute[186219]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  5 07:34:41 np0005546954 nova_compute[186219]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  5 07:34:41 np0005546954 nova_compute[186219]: and ``live_migration_inbound_addr`` respectively.
Dec  5 07:34:41 np0005546954 nova_compute[186219]: ).  Its value may be silently ignored in the future.#033[00m
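[Editor's note: the deprecation warning above says ``live_migration_uri`` should be replaced by ``live_migration_scheme`` and ``live_migration_inbound_addr``. As a minimal sketch, the ``qemu+tls://%s/system`` URI logged below would translate to something like the following ``nova.conf`` fragment; the hostname is a hypothetical example, not taken from this log.]

```ini
[libvirt]
# Replaces the deprecated live_migration_uri = qemu+tls://%s/system.
# Setting the scheme to "tls" selects the qemu+tls:// transport that
# the old URI hard-coded; the inbound address names the migration
# target host (hypothetical value shown here).
live_migration_scheme = tls
live_migration_inbound_addr = compute-1.internal.example
```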
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.126 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.126 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.126 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.126 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.126 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.126 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.127 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.127 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.127 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.127 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.127 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.127 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.127 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.128 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.128 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.128 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.128 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.128 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.128 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.128 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.129 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.129 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.129 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.129 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.129 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.129 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.129 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.130 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.130 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.130 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.130 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.130 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.130 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.131 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.131 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.131 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.131 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.131 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.131 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.131 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.132 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.132 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.132 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.132 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.132 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.132 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.132 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.133 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.133 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.133 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.133 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.133 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.133 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.133 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.134 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.134 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.134 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.134 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.134 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.134 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.134 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.135 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.135 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.135 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.135 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.135 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.135 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.135 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.136 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.136 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.136 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.136 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.136 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.136 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.137 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.137 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.137 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.137 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.137 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.137 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.138 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.138 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.138 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.138 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.138 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.138 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.138 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.139 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.139 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.139 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.139 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.139 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.139 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.139 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.140 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.140 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.140 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.140 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.140 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.140 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.140 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.141 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.141 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.141 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.141 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.141 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.141 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.141 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.142 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.142 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.142 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.142 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.142 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.142 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.142 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.143 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.143 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.143 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.143 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.143 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.143 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.143 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.144 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.144 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.144 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.144 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.144 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.144 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.144 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.145 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.145 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.145 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.145 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.145 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.145 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.145 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.146 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.146 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.146 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.146 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.146 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.146 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.147 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.147 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.147 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.147 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.147 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.147 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.148 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.148 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.148 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.148 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.148 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.148 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.149 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.149 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.149 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.149 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.150 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.150 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.150 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.150 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.150 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.150 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.151 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.151 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.151 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.151 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.151 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.151 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.152 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.152 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.152 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.152 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.152 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.152 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.152 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.153 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.153 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.153 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.153 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.153 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.153 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.154 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.154 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.154 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.154 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.154 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.154 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.154 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.155 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.155 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.155 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.155 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.155 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.155 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.155 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.156 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.156 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.156 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.156 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.156 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.156 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.156 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.157 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.157 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.157 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.157 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.157 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.157 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.158 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.158 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.158 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.158 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.158 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.158 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.159 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.159 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.159 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.159 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.159 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.159 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.159 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.160 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.160 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.160 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.160 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.160 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.160 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.160 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.161 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.161 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.161 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.161 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.161 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.161 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.161 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.162 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.162 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.162 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.162 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.162 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.162 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.162 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.163 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.163 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.163 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.163 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.164 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.164 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.164 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.164 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.164 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.165 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.165 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.165 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.165 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.165 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.166 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.166 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.166 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.166 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.166 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.167 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.167 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.167 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.167 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.167 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.167 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.168 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.168 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.168 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.168 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.168 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.169 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.169 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.169 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.169 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.169 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.170 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.170 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.170 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.170 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.170 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.171 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.171 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.171 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.171 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.171 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.172 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.172 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.172 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.172 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.172 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.173 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.173 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.173 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.173 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.173 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.173 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.174 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.174 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.174 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.174 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.174 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.174 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.175 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.175 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.175 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.175 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.175 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.176 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.176 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.176 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.176 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.176 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.176 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.176 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.177 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.177 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.177 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.177 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.177 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.177 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.177 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.178 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.178 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.178 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.178 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.178 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.178 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.178 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.179 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.179 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.179 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.179 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.179 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.179 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.179 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.180 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.180 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.180 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.180 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.180 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.180 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.180 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.181 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.181 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.181 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.181 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.181 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.181 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.181 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.182 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.182 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.182 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.182 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.182 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.182 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.182 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.183 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.183 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.183 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.183 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.183 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.183 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.183 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.184 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.184 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.184 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.184 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.184 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.184 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.184 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.184 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.185 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.185 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.185 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.185 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.185 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.185 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.185 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.186 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.186 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.186 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.186 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.186 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.186 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.186 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.187 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.187 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.187 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.187 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.187 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.187 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.187 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.188 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.188 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.188 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.188 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.188 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.188 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.188 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.189 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.189 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.189 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.189 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.189 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.189 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.189 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.190 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.190 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.190 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.190 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.190 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.190 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.190 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.191 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.191 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.191 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.191 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.191 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.191 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.192 186223 DEBUG oslo_service.service [None req-567c9ae1-4330-4310-a026-fdb5679e3755 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.193 186223 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.213 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.214 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.214 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.214 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  5 07:34:41 np0005546954 systemd[1]: Starting libvirt QEMU daemon...
Dec  5 07:34:41 np0005546954 systemd[1]: Started libvirt QEMU daemon.
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.309 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd1d7159a60> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.313 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd1d7159a60> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.314 186223 INFO nova.virt.libvirt.driver [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.331 186223 WARNING nova.virt.libvirt.driver [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Dec  5 07:34:41 np0005546954 nova_compute[186219]: 2025-12-05 12:34:41.332 186223 DEBUG nova.virt.libvirt.volume.mount [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.216 186223 INFO nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Libvirt host capabilities <capabilities>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <host>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <uuid>a16ad662-d426-4c8c-9ec3-e00cbbaff345</uuid>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <cpu>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <arch>x86_64</arch>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model>EPYC-Rome-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <vendor>AMD</vendor>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <microcode version='16777317'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <signature family='23' model='49' stepping='0'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='x2apic'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='tsc-deadline'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='osxsave'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='hypervisor'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='tsc_adjust'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='spec-ctrl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='stibp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='arch-capabilities'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='cmp_legacy'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='topoext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='virt-ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='lbrv'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='tsc-scale'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='vmcb-clean'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='pause-filter'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='pfthreshold'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='svme-addr-chk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='rdctl-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='skip-l1dfl-vmentry'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='mds-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature name='pschange-mc-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <pages unit='KiB' size='4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <pages unit='KiB' size='2048'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <pages unit='KiB' size='1048576'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </cpu>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <power_management>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <suspend_mem/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <suspend_disk/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <suspend_hybrid/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </power_management>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <iommu support='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <migration_features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <live/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <uri_transports>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <uri_transport>tcp</uri_transport>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <uri_transport>rdma</uri_transport>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </uri_transports>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </migration_features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <topology>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <cells num='1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <cell id='0'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:          <memory unit='KiB'>7864320</memory>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:          <pages unit='KiB' size='4'>1966080</pages>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:          <pages unit='KiB' size='2048'>0</pages>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:          <distances>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:            <sibling id='0' value='10'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:          </distances>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:          <cpus num='8'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:          </cpus>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        </cell>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </cells>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </topology>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <cache>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </cache>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <secmodel>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model>selinux</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <doi>0</doi>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </secmodel>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <secmodel>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model>dac</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <doi>0</doi>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </secmodel>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </host>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <guest>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <os_type>hvm</os_type>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <arch name='i686'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <wordsize>32</wordsize>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <domain type='qemu'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <domain type='kvm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </arch>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <pae/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <nonpae/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <acpi default='on' toggle='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <apic default='on' toggle='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <cpuselection/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <deviceboot/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <disksnapshot default='on' toggle='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <externalSnapshot/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </guest>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <guest>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <os_type>hvm</os_type>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <arch name='x86_64'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <wordsize>64</wordsize>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <domain type='qemu'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <domain type='kvm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </arch>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <acpi default='on' toggle='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <apic default='on' toggle='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <cpuselection/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <deviceboot/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <disksnapshot default='on' toggle='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <externalSnapshot/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </guest>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 
Dec  5 07:34:42 np0005546954 nova_compute[186219]: </capabilities>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.228 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  5 07:34:42 np0005546954 python3.9[186897]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.275 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  5 07:34:42 np0005546954 nova_compute[186219]: <domainCapabilities>
Dec  5 07:34:42 np0005546954 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <domain>kvm</domain>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <arch>i686</arch>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <vcpu max='4096'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <iothreads supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <os supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <enum name='firmware'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <loader supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>rom</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pflash</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='readonly'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>yes</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>no</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='secure'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>no</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </loader>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </os>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <cpu>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='host-passthrough' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='hostPassthroughMigratable'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>on</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>off</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='maximum' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='maximumMigratable'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>on</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>off</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='host-model' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <vendor>AMD</vendor>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='x2apic'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='hypervisor'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='stibp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='overflow-recov'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='succor'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='ibrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='lbrv'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='tsc-scale'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='flushbyasid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='pause-filter'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='pfthreshold'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='disable' name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='custom' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cooperlake'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cooperlake-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cooperlake-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Dhyana-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Genoa'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amd-psfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='auto-ibrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='stibp-always-on'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amd-psfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='auto-ibrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='stibp-always-on'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Milan'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Milan-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Milan-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amd-psfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='stibp-always-on'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='GraniteRapids'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='prefetchiti'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='GraniteRapids-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='prefetchiti'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='GraniteRapids-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10-128'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10-256'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10-512'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='prefetchiti'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v6'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v7'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='KnightsMill'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512er'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512pf'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='KnightsMill-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512er'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512pf'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G4-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tbm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G5-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tbm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SierraForest'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cmpccxadd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SierraForest-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cmpccxadd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='athlon'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='athlon-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='core2duo'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='core2duo-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='coreduo'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='coreduo-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='n270'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='n270-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='phenom'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='phenom-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </cpu>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <memoryBacking supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <enum name='sourceType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>file</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>anonymous</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>memfd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </memoryBacking>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <devices>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <disk supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='diskDevice'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>disk</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>cdrom</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>floppy</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>lun</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='bus'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>fdc</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>scsi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>usb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>sata</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-non-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </disk>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <graphics supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vnc</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>egl-headless</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>dbus</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </graphics>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <video supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='modelType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vga</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>cirrus</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>none</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>bochs</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>ramfb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </video>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <hostdev supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='mode'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>subsystem</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='startupPolicy'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>default</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>mandatory</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>requisite</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>optional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='subsysType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>usb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pci</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>scsi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='capsType'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='pciBackend'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </hostdev>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <rng supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-non-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendModel'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>random</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>egd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>builtin</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </rng>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <filesystem supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='driverType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>path</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>handle</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtiofs</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </filesystem>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <tpm supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tpm-tis</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tpm-crb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendModel'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>emulator</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>external</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendVersion'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>2.0</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </tpm>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <redirdev supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='bus'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>usb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </redirdev>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <channel supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pty</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>unix</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </channel>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <crypto supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>qemu</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendModel'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>builtin</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </crypto>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <interface supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>default</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>passt</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </interface>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <panic supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>isa</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>hyperv</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </panic>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <console supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>null</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vc</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pty</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>dev</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>file</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pipe</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>stdio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>udp</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tcp</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>unix</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>qemu-vdagent</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>dbus</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </console>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </devices>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <gic supported='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <vmcoreinfo supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <genid supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <backingStoreInput supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <backup supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <async-teardown supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <ps2 supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <sev supported='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <sgx supported='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <hyperv supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='features'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>relaxed</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vapic</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>spinlocks</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vpindex</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>runtime</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>synic</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>stimer</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>reset</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vendor_id</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>frequencies</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>reenlightenment</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tlbflush</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>ipi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>avic</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>emsr_bitmap</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>xmm_input</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <defaults>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <spinlocks>4095</spinlocks>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <stimer_direct>on</stimer_direct>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </defaults>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </hyperv>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <launchSecurity supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='sectype'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tdx</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </launchSecurity>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: </domainCapabilities>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.284 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  5 07:34:42 np0005546954 nova_compute[186219]: <domainCapabilities>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <domain>kvm</domain>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <arch>i686</arch>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <vcpu max='240'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <iothreads supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <os supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <enum name='firmware'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <loader supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>rom</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pflash</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='readonly'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>yes</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>no</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='secure'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>no</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </loader>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </os>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <cpu>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='host-passthrough' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='hostPassthroughMigratable'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>on</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>off</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='maximum' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='maximumMigratable'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>on</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>off</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='host-model' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <vendor>AMD</vendor>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='x2apic'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='hypervisor'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='stibp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='overflow-recov'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='succor'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='ibrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='lbrv'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='tsc-scale'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='flushbyasid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='pause-filter'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='pfthreshold'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='disable' name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='custom' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cooperlake'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cooperlake-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cooperlake-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Dhyana-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Genoa'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amd-psfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='auto-ibrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='stibp-always-on'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amd-psfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='auto-ibrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='stibp-always-on'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Milan'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Milan-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Milan-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amd-psfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='stibp-always-on'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='GraniteRapids'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='prefetchiti'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='GraniteRapids-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='prefetchiti'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='GraniteRapids-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10-128'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10-256'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10-512'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='prefetchiti'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v6'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v7'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='KnightsMill'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512er'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512pf'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='KnightsMill-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512er'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512pf'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G4-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tbm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G5-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tbm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SierraForest'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cmpccxadd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SierraForest-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cmpccxadd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='athlon'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='athlon-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='core2duo'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='core2duo-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='coreduo'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='coreduo-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='n270'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='n270-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='phenom'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='phenom-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </cpu>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <memoryBacking supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <enum name='sourceType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>file</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>anonymous</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>memfd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </memoryBacking>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <devices>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <disk supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='diskDevice'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>disk</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>cdrom</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>floppy</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>lun</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='bus'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>ide</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>fdc</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>scsi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>usb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>sata</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-non-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </disk>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <graphics supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vnc</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>egl-headless</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>dbus</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </graphics>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <video supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='modelType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vga</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>cirrus</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>none</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>bochs</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>ramfb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </video>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <hostdev supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='mode'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>subsystem</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='startupPolicy'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>default</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>mandatory</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>requisite</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>optional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='subsysType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>usb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pci</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>scsi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='capsType'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='pciBackend'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </hostdev>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <rng supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-non-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendModel'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>random</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>egd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>builtin</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </rng>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <filesystem supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='driverType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>path</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>handle</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtiofs</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </filesystem>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <tpm supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tpm-tis</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tpm-crb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendModel'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>emulator</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>external</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendVersion'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>2.0</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </tpm>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <redirdev supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='bus'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>usb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </redirdev>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <channel supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pty</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>unix</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </channel>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <crypto supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>qemu</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendModel'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>builtin</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </crypto>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <interface supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>default</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>passt</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </interface>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <panic supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>isa</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>hyperv</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </panic>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <console supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>null</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vc</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pty</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>dev</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>file</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pipe</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>stdio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>udp</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tcp</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>unix</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>qemu-vdagent</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>dbus</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </console>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </devices>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <gic supported='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <vmcoreinfo supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <genid supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <backingStoreInput supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <backup supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <async-teardown supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <ps2 supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <sev supported='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <sgx supported='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <hyperv supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='features'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>relaxed</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vapic</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>spinlocks</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vpindex</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>runtime</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>synic</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>stimer</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>reset</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vendor_id</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>frequencies</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>reenlightenment</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tlbflush</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>ipi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>avic</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>emsr_bitmap</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>xmm_input</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <defaults>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <spinlocks>4095</spinlocks>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <stimer_direct>on</stimer_direct>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </defaults>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </hyperv>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <launchSecurity supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='sectype'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tdx</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </launchSecurity>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: </domainCapabilities>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.343 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.347 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  5 07:34:42 np0005546954 nova_compute[186219]: <domainCapabilities>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <domain>kvm</domain>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <arch>x86_64</arch>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <vcpu max='4096'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <iothreads supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <os supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <enum name='firmware'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>efi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <loader supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>rom</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pflash</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='readonly'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>yes</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>no</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='secure'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>yes</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>no</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </loader>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </os>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <cpu>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='host-passthrough' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='hostPassthroughMigratable'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>on</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>off</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='maximum' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='maximumMigratable'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>on</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>off</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='host-model' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <vendor>AMD</vendor>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='x2apic'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='hypervisor'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='stibp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='overflow-recov'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='succor'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='ibrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='lbrv'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='tsc-scale'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='flushbyasid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='pause-filter'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='pfthreshold'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='disable' name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='custom' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cooperlake'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cooperlake-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cooperlake-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Dhyana-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Genoa'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amd-psfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='auto-ibrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='stibp-always-on'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amd-psfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='auto-ibrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='stibp-always-on'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Milan'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Milan-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Milan-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amd-psfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='stibp-always-on'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='GraniteRapids'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='prefetchiti'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='GraniteRapids-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='prefetchiti'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='GraniteRapids-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10-128'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10-256'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10-512'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='prefetchiti'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v6'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v7'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='KnightsMill'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512er'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512pf'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='KnightsMill-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512er'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512pf'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G4-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tbm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G5-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tbm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SierraForest'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cmpccxadd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SierraForest-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cmpccxadd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='athlon'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='athlon-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='core2duo'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='core2duo-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='coreduo'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='coreduo-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='n270'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='n270-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='phenom'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='phenom-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </cpu>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <memoryBacking supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <enum name='sourceType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>file</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>anonymous</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>memfd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </memoryBacking>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <devices>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <disk supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='diskDevice'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>disk</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>cdrom</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>floppy</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>lun</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='bus'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>fdc</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>scsi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>usb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>sata</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-non-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </disk>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <graphics supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vnc</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>egl-headless</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>dbus</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </graphics>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <video supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='modelType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vga</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>cirrus</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>none</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>bochs</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>ramfb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </video>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <hostdev supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='mode'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>subsystem</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='startupPolicy'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>default</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>mandatory</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>requisite</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>optional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='subsysType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>usb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pci</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>scsi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='capsType'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='pciBackend'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </hostdev>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <rng supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-non-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendModel'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>random</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>egd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>builtin</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </rng>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <filesystem supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='driverType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>path</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>handle</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtiofs</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </filesystem>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <tpm supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tpm-tis</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tpm-crb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendModel'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>emulator</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>external</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendVersion'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>2.0</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </tpm>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <redirdev supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='bus'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>usb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </redirdev>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <channel supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pty</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>unix</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </channel>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <crypto supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>qemu</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendModel'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>builtin</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </crypto>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <interface supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>default</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>passt</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </interface>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <panic supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>isa</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>hyperv</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </panic>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <console supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>null</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vc</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pty</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>dev</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>file</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pipe</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>stdio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>udp</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tcp</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>unix</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>qemu-vdagent</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>dbus</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </console>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </devices>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <gic supported='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <vmcoreinfo supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <genid supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <backingStoreInput supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <backup supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <async-teardown supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <ps2 supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <sev supported='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <sgx supported='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <hyperv supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='features'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>relaxed</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vapic</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>spinlocks</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vpindex</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>runtime</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>synic</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>stimer</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>reset</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vendor_id</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>frequencies</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>reenlightenment</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tlbflush</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>ipi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>avic</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>emsr_bitmap</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>xmm_input</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <defaults>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <spinlocks>4095</spinlocks>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <stimer_direct>on</stimer_direct>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </defaults>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </hyperv>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <launchSecurity supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='sectype'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tdx</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </launchSecurity>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: </domainCapabilities>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.423 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  5 07:34:42 np0005546954 nova_compute[186219]: <domainCapabilities>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <domain>kvm</domain>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <arch>x86_64</arch>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <vcpu max='240'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <iothreads supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <os supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <enum name='firmware'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <loader supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>rom</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pflash</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='readonly'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>yes</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>no</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='secure'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>no</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </loader>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </os>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <cpu>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='host-passthrough' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='hostPassthroughMigratable'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>on</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>off</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='maximum' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='maximumMigratable'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>on</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>off</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='host-model' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <vendor>AMD</vendor>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='x2apic'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='hypervisor'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='stibp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='overflow-recov'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='succor'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='ibrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='lbrv'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='tsc-scale'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='flushbyasid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='pause-filter'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='pfthreshold'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <feature policy='disable' name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <mode name='custom' supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Broadwell-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cooperlake'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cooperlake-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Cooperlake-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Denverton-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Dhyana-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Genoa'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amd-psfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='auto-ibrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='stibp-always-on'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amd-psfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='auto-ibrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='stibp-always-on'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Milan'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Milan-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Milan-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amd-psfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='stibp-always-on'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-Rome-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='EPYC-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='GraniteRapids'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='prefetchiti'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='GraniteRapids-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='prefetchiti'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='GraniteRapids-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10-128'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10-256'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx10-512'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='prefetchiti'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Haswell-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v6'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Icelake-Server-v7'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='IvyBridge-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='KnightsMill'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512er'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512pf'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='KnightsMill-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512er'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512pf'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G4-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tbm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Opteron_G5-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fma4'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tbm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xop'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SapphireRapids-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='amx-tile'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-bf16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-fp16'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bitalg'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrc'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fzrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='la57'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='taa-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xfd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SierraForest'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cmpccxadd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='SierraForest-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ifma'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cmpccxadd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fbsdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='fsrs'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ibrs-all'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mcdt-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pbrsb-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='psdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='serialize'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vaes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Client-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='hle'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='rtm'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Skylake-Server-v5'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512bw'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512cd'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512dq'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512f'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='avx512vl'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='invpcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pcid'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='pku'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='mpx'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v2'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v3'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='core-capability'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='split-lock-detect'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='Snowridge-v4'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='cldemote'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='erms'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='gfni'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdir64b'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='movdiri'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='xsaves'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='athlon'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='athlon-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='core2duo'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='core2duo-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='coreduo'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='coreduo-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='n270'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='n270-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='ss'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='phenom'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <blockers model='phenom-v1'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnow'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <feature name='3dnowext'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </blockers>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </mode>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </cpu>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <memoryBacking supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <enum name='sourceType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>file</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>anonymous</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <value>memfd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </memoryBacking>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <devices>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <disk supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='diskDevice'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>disk</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>cdrom</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>floppy</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>lun</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='bus'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>ide</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>fdc</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>scsi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>usb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>sata</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-non-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </disk>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <graphics supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vnc</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>egl-headless</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>dbus</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </graphics>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <video supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='modelType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vga</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>cirrus</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>none</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>bochs</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>ramfb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </video>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <hostdev supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='mode'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>subsystem</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='startupPolicy'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>default</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>mandatory</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>requisite</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>optional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='subsysType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>usb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pci</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>scsi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='capsType'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='pciBackend'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </hostdev>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <rng supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtio-non-transitional</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendModel'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>random</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>egd</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>builtin</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </rng>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <filesystem supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='driverType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>path</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>handle</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>virtiofs</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </filesystem>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <tpm supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tpm-tis</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tpm-crb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendModel'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>emulator</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>external</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendVersion'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>2.0</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </tpm>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <redirdev supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='bus'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>usb</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </redirdev>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <channel supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pty</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>unix</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </channel>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <crypto supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>qemu</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendModel'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>builtin</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </crypto>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <interface supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='backendType'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>default</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>passt</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </interface>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <panic supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='model'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>isa</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>hyperv</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </panic>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <console supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='type'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>null</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vc</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pty</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>dev</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>file</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>pipe</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>stdio</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>udp</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tcp</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>unix</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>qemu-vdagent</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>dbus</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </console>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </devices>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <gic supported='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <vmcoreinfo supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <genid supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <backingStoreInput supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <backup supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <async-teardown supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <ps2 supported='yes'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <sev supported='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <sgx supported='no'/>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <hyperv supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='features'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>relaxed</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vapic</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>spinlocks</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vpindex</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>runtime</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>synic</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>stimer</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>reset</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>vendor_id</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>frequencies</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>reenlightenment</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tlbflush</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>ipi</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>avic</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>emsr_bitmap</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>xmm_input</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <defaults>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <spinlocks>4095</spinlocks>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <stimer_direct>on</stimer_direct>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </defaults>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </hyperv>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    <launchSecurity supported='yes'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      <enum name='sectype'>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:        <value>tdx</value>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:      </enum>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:    </launchSecurity>
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  </features>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: </domainCapabilities>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.503 186223 DEBUG nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.503 186223 INFO nova.virt.libvirt.host [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Secure Boot support detected#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.505 186223 INFO nova.virt.libvirt.driver [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.505 186223 INFO nova.virt.libvirt.driver [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.515 186223 DEBUG nova.virt.libvirt.driver [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] cpu compare xml: <cpu match="exact">
Dec  5 07:34:42 np0005546954 nova_compute[186219]:  <model>Nehalem</model>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: </cpu>
Dec  5 07:34:42 np0005546954 nova_compute[186219]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.519 186223 DEBUG nova.virt.libvirt.driver [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.583 186223 INFO nova.virt.node [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Determined node identity eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from /var/lib/nova/compute_id#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.601 186223 WARNING nova.compute.manager [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Compute nodes ['eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.668 186223 INFO nova.compute.manager [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.702 186223 WARNING nova.compute.manager [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.702 186223 DEBUG oslo_concurrency.lockutils [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.703 186223 DEBUG oslo_concurrency.lockutils [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.703 186223 DEBUG oslo_concurrency.lockutils [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:34:42 np0005546954 nova_compute[186219]: 2025-12-05 12:34:42.703 186223 DEBUG nova.compute.resource_tracker [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:34:42 np0005546954 systemd[1]: Starting libvirt nodedev daemon...
Dec  5 07:34:42 np0005546954 systemd[1]: Started libvirt nodedev daemon.
Dec  5 07:34:43 np0005546954 nova_compute[186219]: 2025-12-05 12:34:43.008 186223 WARNING nova.virt.libvirt.driver [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:34:43 np0005546954 nova_compute[186219]: 2025-12-05 12:34:43.009 186223 DEBUG nova.compute.resource_tracker [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6191MB free_disk=73.5436782836914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:34:43 np0005546954 nova_compute[186219]: 2025-12-05 12:34:43.009 186223 DEBUG oslo_concurrency.lockutils [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:34:43 np0005546954 nova_compute[186219]: 2025-12-05 12:34:43.009 186223 DEBUG oslo_concurrency.lockutils [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:34:43 np0005546954 nova_compute[186219]: 2025-12-05 12:34:43.022 186223 WARNING nova.compute.resource_tracker [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] No compute node record for compute-1.ctlplane.example.com:eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b could not be found.#033[00m
Dec  5 07:34:43 np0005546954 nova_compute[186219]: 2025-12-05 12:34:43.043 186223 INFO nova.compute.resource_tracker [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b#033[00m
Dec  5 07:34:43 np0005546954 nova_compute[186219]: 2025-12-05 12:34:43.102 186223 DEBUG nova.compute.resource_tracker [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:34:43 np0005546954 nova_compute[186219]: 2025-12-05 12:34:43.102 186223 DEBUG nova.compute.resource_tracker [None req-34abcad0-ce45-4330-b001-3bb9c42f3480 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:34:43 np0005546954 python3.9[187099]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:34:43 np0005546954 systemd[1]: Stopping nova_compute container...
Dec  5 07:34:43 np0005546954 virtqemud[186730]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec  5 07:34:43 np0005546954 virtqemud[186730]: hostname: compute-1
Dec  5 07:34:43 np0005546954 virtqemud[186730]: End of file while reading data: Input/output error
Dec  5 07:34:43 np0005546954 systemd[1]: libpod-315359a1f41c2807b4acaa531eb8dfd4d78461436dbf2825efa04aa7c738dbca.scope: Deactivated successfully.
Dec  5 07:34:43 np0005546954 podman[187103]: 2025-12-05 12:34:43.395427204 +0000 UTC m=+0.060856712 container died 315359a1f41c2807b4acaa531eb8dfd4d78461436dbf2825efa04aa7c738dbca (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 07:34:43 np0005546954 systemd[1]: libpod-315359a1f41c2807b4acaa531eb8dfd4d78461436dbf2825efa04aa7c738dbca.scope: Consumed 2.944s CPU time.
Dec  5 07:34:43 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-315359a1f41c2807b4acaa531eb8dfd4d78461436dbf2825efa04aa7c738dbca-userdata-shm.mount: Deactivated successfully.
Dec  5 07:34:43 np0005546954 systemd[1]: var-lib-containers-storage-overlay-1f31841d93424af2814b3b81009358d18034b65eda328d3a1037107d28056002-merged.mount: Deactivated successfully.
Dec  5 07:34:43 np0005546954 podman[187103]: 2025-12-05 12:34:43.448081258 +0000 UTC m=+0.113510766 container cleanup 315359a1f41c2807b4acaa531eb8dfd4d78461436dbf2825efa04aa7c738dbca (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 07:34:43 np0005546954 podman[187103]: nova_compute
Dec  5 07:34:43 np0005546954 podman[187132]: nova_compute
Dec  5 07:34:43 np0005546954 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec  5 07:34:43 np0005546954 systemd[1]: Stopped nova_compute container.
Dec  5 07:34:43 np0005546954 systemd[1]: Starting nova_compute container...
Dec  5 07:34:43 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:34:43 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f31841d93424af2814b3b81009358d18034b65eda328d3a1037107d28056002/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:43 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f31841d93424af2814b3b81009358d18034b65eda328d3a1037107d28056002/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:43 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f31841d93424af2814b3b81009358d18034b65eda328d3a1037107d28056002/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:43 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f31841d93424af2814b3b81009358d18034b65eda328d3a1037107d28056002/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:43 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f31841d93424af2814b3b81009358d18034b65eda328d3a1037107d28056002/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:43 np0005546954 podman[187145]: 2025-12-05 12:34:43.656627001 +0000 UTC m=+0.114319292 container init 315359a1f41c2807b4acaa531eb8dfd4d78461436dbf2825efa04aa7c738dbca (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:34:43 np0005546954 podman[187145]: 2025-12-05 12:34:43.662216942 +0000 UTC m=+0.119909203 container start 315359a1f41c2807b4acaa531eb8dfd4d78461436dbf2825efa04aa7c738dbca (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3)
Dec  5 07:34:43 np0005546954 nova_compute[187160]: + sudo -E kolla_set_configs
Dec  5 07:34:43 np0005546954 podman[187145]: nova_compute
Dec  5 07:34:43 np0005546954 systemd[1]: Started nova_compute container.
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Validating config file
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Copying service configuration files
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Deleting /etc/ceph
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Creating directory /etc/ceph
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Setting permission for /etc/ceph
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Writing out command to execute
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  5 07:34:43 np0005546954 nova_compute[187160]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  5 07:34:43 np0005546954 nova_compute[187160]: ++ cat /run_command
Dec  5 07:34:43 np0005546954 nova_compute[187160]: + CMD=nova-compute
Dec  5 07:34:43 np0005546954 nova_compute[187160]: + ARGS=
Dec  5 07:34:43 np0005546954 nova_compute[187160]: + sudo kolla_copy_cacerts
Dec  5 07:34:43 np0005546954 nova_compute[187160]: + [[ ! -n '' ]]
Dec  5 07:34:43 np0005546954 nova_compute[187160]: + . kolla_extend_start
Dec  5 07:34:43 np0005546954 nova_compute[187160]: Running command: 'nova-compute'
Dec  5 07:34:43 np0005546954 nova_compute[187160]: + echo 'Running command: '\''nova-compute'\'''
Dec  5 07:34:43 np0005546954 nova_compute[187160]: + umask 0022
Dec  5 07:34:43 np0005546954 nova_compute[187160]: + exec nova-compute
Dec  5 07:34:44 np0005546954 python3.9[187323]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  5 07:34:44 np0005546954 systemd[1]: Started libpod-conmon-51bc0995ce05197f39172c0011f26f6d0e2cad9af4bce8703b7a13fc12787ed3.scope.
Dec  5 07:34:44 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:34:44 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e34fcd8147b6b6f6ab5e1556b9f0d32f5a887a4fc580439fa083886e840e748/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:44 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e34fcd8147b6b6f6ab5e1556b9f0d32f5a887a4fc580439fa083886e840e748/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:44 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e34fcd8147b6b6f6ab5e1556b9f0d32f5a887a4fc580439fa083886e840e748/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec  5 07:34:44 np0005546954 podman[187349]: 2025-12-05 12:34:44.977931744 +0000 UTC m=+0.143977303 container init 51bc0995ce05197f39172c0011f26f6d0e2cad9af4bce8703b7a13fc12787ed3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Dec  5 07:34:44 np0005546954 podman[187349]: 2025-12-05 12:34:44.991865885 +0000 UTC m=+0.157911424 container start 51bc0995ce05197f39172c0011f26f6d0e2cad9af4bce8703b7a13fc12787ed3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.build-date=20251125)
Dec  5 07:34:45 np0005546954 python3.9[187323]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Applying nova statedir ownership
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec  5 07:34:45 np0005546954 nova_compute_init[187370]: INFO:nova_statedir:Nova statedir ownership complete
Dec  5 07:34:45 np0005546954 systemd[1]: libpod-51bc0995ce05197f39172c0011f26f6d0e2cad9af4bce8703b7a13fc12787ed3.scope: Deactivated successfully.
Dec  5 07:34:45 np0005546954 podman[187371]: 2025-12-05 12:34:45.050970239 +0000 UTC m=+0.027798291 container died 51bc0995ce05197f39172c0011f26f6d0e2cad9af4bce8703b7a13fc12787ed3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:34:45 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-51bc0995ce05197f39172c0011f26f6d0e2cad9af4bce8703b7a13fc12787ed3-userdata-shm.mount: Deactivated successfully.
Dec  5 07:34:45 np0005546954 systemd[1]: var-lib-containers-storage-overlay-9e34fcd8147b6b6f6ab5e1556b9f0d32f5a887a4fc580439fa083886e840e748-merged.mount: Deactivated successfully.
Dec  5 07:34:45 np0005546954 podman[187384]: 2025-12-05 12:34:45.107415127 +0000 UTC m=+0.046971313 container cleanup 51bc0995ce05197f39172c0011f26f6d0e2cad9af4bce8703b7a13fc12787ed3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  5 07:34:45 np0005546954 systemd[1]: libpod-conmon-51bc0995ce05197f39172c0011f26f6d0e2cad9af4bce8703b7a13fc12787ed3.scope: Deactivated successfully.
Dec  5 07:34:45 np0005546954 systemd[1]: session-25.scope: Deactivated successfully.
Dec  5 07:34:45 np0005546954 systemd[1]: session-25.scope: Consumed 2min 6.264s CPU time.
Dec  5 07:34:45 np0005546954 systemd-logind[789]: Session 25 logged out. Waiting for processes to exit.
Dec  5 07:34:45 np0005546954 systemd-logind[789]: Removed session 25.
Dec  5 07:34:45 np0005546954 nova_compute[187160]: 2025-12-05 12:34:45.870 187164 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  5 07:34:45 np0005546954 nova_compute[187160]: 2025-12-05 12:34:45.871 187164 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  5 07:34:45 np0005546954 nova_compute[187160]: 2025-12-05 12:34:45.871 187164 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  5 07:34:45 np0005546954 nova_compute[187160]: 2025-12-05 12:34:45.871 187164 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.006 187164 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.028 187164 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.028 187164 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.465 187164 INFO nova.virt.driver [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.564 187164 INFO nova.compute.provider_config [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.574 187164 DEBUG oslo_concurrency.lockutils [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.575 187164 DEBUG oslo_concurrency.lockutils [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.575 187164 DEBUG oslo_concurrency.lockutils [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.575 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.576 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.576 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.576 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.576 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.576 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.577 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.577 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.577 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.577 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.577 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.578 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.578 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.578 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.578 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.578 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.578 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.579 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.579 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.579 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.579 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.579 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.579 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.579 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.580 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.580 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.580 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.580 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.580 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.581 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.581 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.581 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.581 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.581 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.581 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.581 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.582 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.582 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.582 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.582 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.582 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.582 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.583 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.583 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.583 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.583 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.583 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.583 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.584 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.584 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.584 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.584 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.584 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.584 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.584 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.585 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.585 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.585 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.585 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.585 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.585 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.586 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.586 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.586 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.586 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.586 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.586 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.586 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.587 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.587 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.587 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.587 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.587 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.587 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.588 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.588 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.588 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.588 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.588 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.588 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.588 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.589 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.589 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.589 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.589 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.589 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.589 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.589 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.590 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.590 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.590 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.590 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.590 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.590 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.590 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.591 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.591 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.591 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.591 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.591 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.591 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.591 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.592 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.592 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.592 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.592 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.592 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.592 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.592 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.593 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.593 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.593 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.593 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.593 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.593 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.593 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.594 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.594 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.594 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.594 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.594 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.594 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.594 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.595 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.595 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.595 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.595 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.595 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.596 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.596 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.596 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.596 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.596 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.596 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.597 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.597 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.597 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.597 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.597 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.598 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.598 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.598 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.598 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.598 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.598 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.598 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.599 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.599 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.599 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.599 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.599 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.599 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.600 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.600 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.600 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.600 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.600 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.600 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.600 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.601 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.601 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.601 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.601 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.601 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.602 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.602 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.602 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.602 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.602 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.602 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.602 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.603 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.603 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.603 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.603 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.603 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.603 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.604 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.604 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.604 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.604 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.604 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.604 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.604 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.605 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.605 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.605 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.605 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.605 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.605 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.605 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.606 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.606 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.606 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.606 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.606 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.606 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.607 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.607 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.607 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.607 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.607 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.607 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.607 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.608 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.608 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.608 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.608 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.608 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.608 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.609 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.609 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.609 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.609 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.609 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.609 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.610 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.610 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.610 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.610 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.610 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.610 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.610 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.611 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.611 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.611 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.611 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.611 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.611 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.612 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.612 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.612 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.612 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.612 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.612 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.613 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.613 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.613 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.613 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.613 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.613 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.613 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.614 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.614 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.614 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.614 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.614 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.614 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.614 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.615 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.615 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.615 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.615 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.615 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.615 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.615 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.616 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.616 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.616 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.616 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.616 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.616 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.616 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.617 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.617 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.617 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.617 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.617 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.617 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.618 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.618 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.618 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.618 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.618 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.618 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.619 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.619 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.619 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.619 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.619 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.619 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.620 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.620 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.620 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.620 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.620 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.620 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.620 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.621 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.621 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.621 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.621 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.621 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.621 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.621 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.622 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.622 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.622 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.622 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.622 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.622 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.622 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.623 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.623 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.623 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.623 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.623 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.623 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.624 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.624 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.624 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.624 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.624 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.624 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.624 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.625 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.625 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.625 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.625 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.625 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.625 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.625 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.626 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.626 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.626 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.626 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.626 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.626 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.626 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.627 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.627 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.627 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.627 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.627 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.627 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.628 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.628 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.628 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.628 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.628 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.628 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.629 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.629 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.629 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.629 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.629 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.629 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.629 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.630 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.630 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.630 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.630 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.630 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.630 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.630 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.631 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.631 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.631 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.631 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.631 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.632 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.632 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.632 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.632 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.632 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.633 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.633 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.633 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.633 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.633 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.633 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.634 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.634 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.634 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.634 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.634 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.634 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.635 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.635 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.635 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.635 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.636 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.636 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.636 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.636 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.636 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.636 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.637 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.637 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.637 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.637 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.637 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.638 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.638 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.638 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.638 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.638 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.638 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.638 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.639 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.639 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.639 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.639 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.639 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.639 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.640 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.640 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.640 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.640 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.640 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.640 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.641 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.641 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.641 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.641 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.641 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.641 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.641 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.642 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.642 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.642 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.642 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.642 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.642 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.643 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.643 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.643 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.643 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.643 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.643 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.644 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.644 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.644 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.644 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.644 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.644 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.644 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.645 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.645 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.645 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.645 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.645 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.645 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.645 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.646 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.646 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.646 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.646 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.646 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.646 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.646 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.647 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.647 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.647 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.647 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.647 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.648 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.648 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.648 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.648 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.648 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.649 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.649 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.649 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.649 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.649 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.649 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.649 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.650 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.650 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.650 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.650 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.650 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.650 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.651 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.651 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.651 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.651 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.651 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.651 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.651 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.652 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.652 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.652 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.652 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.652 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.652 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.653 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.653 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.653 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.653 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.654 187164 WARNING oslo_config.cfg [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  5 07:34:46 np0005546954 nova_compute[187160]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  5 07:34:46 np0005546954 nova_compute[187160]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  5 07:34:46 np0005546954 nova_compute[187160]: and ``live_migration_inbound_addr`` respectively.
Dec  5 07:34:46 np0005546954 nova_compute[187160]: ).  Its value may be silently ignored in the future.#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.654 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.654 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.654 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.654 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.654 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.655 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.655 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.655 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.655 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.655 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.655 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.655 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.656 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.656 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.656 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.656 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.656 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.656 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.657 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.657 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.657 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.657 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.657 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.657 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.658 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.658 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.658 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.658 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.658 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.658 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.658 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.659 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.659 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.659 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.659 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.659 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.659 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.660 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.660 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.660 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.660 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.660 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.660 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.661 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.661 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.661 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.661 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.661 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.661 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.662 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.662 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.662 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.662 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.663 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.663 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.663 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.663 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.663 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.664 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.664 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.664 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.664 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.664 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.665 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.665 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.665 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.665 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.665 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.665 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.666 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.666 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.666 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.666 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.666 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.666 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.667 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.667 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.667 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.667 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.667 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.667 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.667 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.668 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.668 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.668 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.668 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.668 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.668 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.669 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.669 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.669 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.669 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.669 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.669 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.669 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.670 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.670 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.670 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.670 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.670 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.670 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.670 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.671 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.671 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.671 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.671 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.672 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.672 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.672 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.672 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.672 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.672 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.673 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.673 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.673 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.673 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.673 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.674 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.674 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.674 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.674 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.674 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.675 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.675 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.675 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.675 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.675 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.675 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.676 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.676 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.676 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.676 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.676 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.676 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.677 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.677 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.677 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.677 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.677 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.677 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.678 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.678 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.678 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.678 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.678 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.679 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.679 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.679 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.679 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.679 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.680 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.680 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.680 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.680 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.680 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.680 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.681 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.681 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.681 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.681 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.681 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.681 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.682 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.682 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.682 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.682 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.682 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.682 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.682 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.683 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.683 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.683 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.683 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.683 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.683 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.684 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.684 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.684 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.684 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.684 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.685 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.685 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.685 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.685 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.685 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.685 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.686 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.686 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.686 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.686 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.686 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.686 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.686 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.686 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.687 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.687 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.687 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.687 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.687 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.687 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.688 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.688 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.688 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.688 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.688 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.688 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.688 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.689 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.689 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.689 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.689 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.689 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.689 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.689 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.690 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.690 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.690 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.690 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.690 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.690 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.691 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.691 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.691 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.691 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.691 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.691 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.691 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.692 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.692 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.692 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.692 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.692 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.692 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.692 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.693 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.693 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.693 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.693 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.693 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.693 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.693 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.694 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.694 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.694 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.694 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.694 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.694 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.694 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.695 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.695 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.695 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.695 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.695 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.695 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.696 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.696 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.696 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.696 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.696 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.696 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.696 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.697 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.697 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.697 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.697 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.697 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.697 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.698 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.698 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.698 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.698 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.698 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.698 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.699 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.699 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.699 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.699 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.699 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.700 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.700 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.700 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.700 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.700 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.701 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.701 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.701 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.701 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.701 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.701 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.702 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.702 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.702 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.702 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.702 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.703 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.703 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.703 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.703 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.703 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.704 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.704 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.704 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.704 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.704 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.704 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.705 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.705 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.705 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.705 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.705 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.705 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.706 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.706 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.706 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.706 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.706 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.706 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.707 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.707 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.707 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.707 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.707 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.707 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.707 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.708 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.708 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.708 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.708 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.708 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.708 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.708 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.709 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.709 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.709 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.709 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.709 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.709 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.709 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.710 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.710 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.710 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.710 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.710 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.710 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.711 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.711 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.711 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.711 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.711 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.711 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.711 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.712 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.712 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.712 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.712 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.712 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.712 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.712 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.712 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.713 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.713 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.713 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.713 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.713 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.713 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.713 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.714 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.714 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.714 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.714 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.714 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.714 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.714 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.715 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.715 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.715 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.715 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.715 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.715 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.716 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.716 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.716 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.716 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.716 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.716 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.716 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.717 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.717 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.717 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.717 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.717 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.718 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.718 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.718 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.718 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.718 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.718 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.719 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.719 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.719 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.719 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.719 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.719 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.720 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.720 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.720 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.720 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.720 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.720 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.721 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.721 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.721 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.721 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.721 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.722 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.722 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.722 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.722 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.722 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.722 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.723 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.723 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.723 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.723 187164 DEBUG oslo_service.service [None req-8732048a-f0bc-4f08-bc20-d2290a55e8fd - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.724 187164 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.738 187164 INFO nova.virt.node [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Determined node identity eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from /var/lib/nova/compute_id#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.739 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.740 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.740 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.741 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.757 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fc52fa7c640> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.761 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fc52fa7c640> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.763 187164 INFO nova.virt.libvirt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.775 187164 DEBUG nova.virt.libvirt.volume.mount [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.777 187164 INFO nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Libvirt host capabilities <capabilities>
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <host>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <uuid>a16ad662-d426-4c8c-9ec3-e00cbbaff345</uuid>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <cpu>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <arch>x86_64</arch>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model>EPYC-Rome-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <vendor>AMD</vendor>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <microcode version='16777317'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <signature family='23' model='49' stepping='0'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='x2apic'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='tsc-deadline'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='osxsave'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='hypervisor'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='tsc_adjust'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='spec-ctrl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='stibp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='arch-capabilities'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='ssbd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='cmp_legacy'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='topoext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='virt-ssbd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='lbrv'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='tsc-scale'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='vmcb-clean'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='pause-filter'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='pfthreshold'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='svme-addr-chk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='rdctl-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='skip-l1dfl-vmentry'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='mds-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature name='pschange-mc-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <pages unit='KiB' size='4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <pages unit='KiB' size='2048'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <pages unit='KiB' size='1048576'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </cpu>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <power_management>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <suspend_mem/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <suspend_disk/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <suspend_hybrid/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </power_management>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <iommu support='no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <migration_features>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <live/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <uri_transports>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <uri_transport>tcp</uri_transport>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <uri_transport>rdma</uri_transport>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </uri_transports>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </migration_features>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <topology>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <cells num='1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <cell id='0'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:          <memory unit='KiB'>7864320</memory>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:          <pages unit='KiB' size='4'>1966080</pages>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:          <pages unit='KiB' size='2048'>0</pages>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:          <distances>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:            <sibling id='0' value='10'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:          </distances>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:          <cpus num='8'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:          </cpus>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        </cell>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </cells>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </topology>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <cache>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </cache>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <secmodel>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model>selinux</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <doi>0</doi>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </secmodel>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <secmodel>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model>dac</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <doi>0</doi>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </secmodel>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </host>
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <guest>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <os_type>hvm</os_type>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <arch name='i686'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <wordsize>32</wordsize>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <domain type='qemu'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <domain type='kvm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </arch>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <features>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <pae/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <nonpae/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <acpi default='on' toggle='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <apic default='on' toggle='no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <cpuselection/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <deviceboot/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <disksnapshot default='on' toggle='no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <externalSnapshot/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </features>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </guest>
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <guest>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <os_type>hvm</os_type>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <arch name='x86_64'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <wordsize>64</wordsize>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <domain type='qemu'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <domain type='kvm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </arch>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <features>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <acpi default='on' toggle='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <apic default='on' toggle='no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <cpuselection/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <deviceboot/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <disksnapshot default='on' toggle='no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <externalSnapshot/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </features>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </guest>
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 
Dec  5 07:34:46 np0005546954 nova_compute[187160]: </capabilities>
Dec  5 07:34:46 np0005546954 nova_compute[187160]: #033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.794 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.800 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  5 07:34:46 np0005546954 nova_compute[187160]: <domainCapabilities>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <domain>kvm</domain>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <arch>i686</arch>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <vcpu max='4096'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <iothreads supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <os supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <enum name='firmware'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <loader supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>rom</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pflash</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='readonly'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>yes</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>no</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='secure'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>no</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </loader>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <cpu>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <mode name='host-passthrough' supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='hostPassthroughMigratable'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>on</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>off</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <mode name='maximum' supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='maximumMigratable'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>on</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>off</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <mode name='host-model' supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <vendor>AMD</vendor>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='x2apic'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='hypervisor'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='stibp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='ssbd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='overflow-recov'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='succor'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='ibrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='lbrv'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='tsc-scale'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='flushbyasid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='pause-filter'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='pfthreshold'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='disable' name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <mode name='custom' supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-noTSX'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cooperlake'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cooperlake-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cooperlake-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Denverton'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Denverton-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Denverton-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Denverton-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Dhyana-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Genoa'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amd-psfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='auto-ibrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='stibp-always-on'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amd-psfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='auto-ibrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='stibp-always-on'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Milan'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Milan-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Milan-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amd-psfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='stibp-always-on'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='GraniteRapids'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='prefetchiti'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='GraniteRapids-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='prefetchiti'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='GraniteRapids-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx10'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx10-128'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx10-256'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx10-512'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='prefetchiti'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-noTSX'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v5'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v6'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v7'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='KnightsMill'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512er'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512pf'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='KnightsMill-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512er'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512pf'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G4-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G5'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tbm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G5-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tbm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SierraForest'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cmpccxadd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SierraForest-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cmpccxadd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v5'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='athlon'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='athlon-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='core2duo'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='core2duo-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='coreduo'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='coreduo-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='n270'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='n270-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='phenom'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='phenom-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <memoryBacking supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <enum name='sourceType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>file</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>anonymous</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>memfd</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </memoryBacking>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <disk supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='diskDevice'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>disk</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>cdrom</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>floppy</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>lun</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='bus'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>fdc</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>scsi</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>usb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>sata</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio-transitional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio-non-transitional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <graphics supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vnc</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>egl-headless</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>dbus</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </graphics>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <video supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='modelType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vga</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>cirrus</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>none</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>bochs</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>ramfb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <hostdev supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='mode'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>subsystem</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='startupPolicy'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>default</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>mandatory</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>requisite</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>optional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='subsysType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>usb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pci</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>scsi</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='capsType'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='pciBackend'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </hostdev>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <rng supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio-transitional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio-non-transitional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendModel'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>random</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>egd</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>builtin</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <filesystem supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='driverType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>path</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>handle</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtiofs</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </filesystem>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <tpm supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>tpm-tis</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>tpm-crb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendModel'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>emulator</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>external</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendVersion'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>2.0</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </tpm>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <redirdev supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='bus'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>usb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </redirdev>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <channel supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pty</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>unix</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </channel>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <crypto supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>qemu</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendModel'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>builtin</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </crypto>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <interface supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>default</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>passt</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <panic supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>isa</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>hyperv</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </panic>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <console supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>null</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vc</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pty</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>dev</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>file</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pipe</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>stdio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>udp</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>tcp</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>unix</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>qemu-vdagent</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>dbus</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </console>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <gic supported='no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <vmcoreinfo supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <genid supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <backingStoreInput supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <backup supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <async-teardown supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <ps2 supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <sev supported='no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <sgx supported='no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <hyperv supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='features'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>relaxed</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vapic</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>spinlocks</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vpindex</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>runtime</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>synic</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>stimer</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>reset</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vendor_id</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>frequencies</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>reenlightenment</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>tlbflush</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>ipi</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>avic</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>emsr_bitmap</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>xmm_input</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <defaults>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <spinlocks>4095</spinlocks>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <stimer_direct>on</stimer_direct>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </defaults>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </hyperv>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <launchSecurity supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='sectype'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>tdx</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </launchSecurity>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:34:46 np0005546954 nova_compute[187160]: </domainCapabilities>
Dec  5 07:34:46 np0005546954 nova_compute[187160]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.807 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  5 07:34:46 np0005546954 nova_compute[187160]: <domainCapabilities>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <domain>kvm</domain>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <arch>i686</arch>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <vcpu max='240'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <iothreads supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <os supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <enum name='firmware'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <loader supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>rom</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pflash</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='readonly'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>yes</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>no</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='secure'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>no</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </loader>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <cpu>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <mode name='host-passthrough' supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='hostPassthroughMigratable'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>on</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>off</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <mode name='maximum' supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='maximumMigratable'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>on</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>off</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <mode name='host-model' supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <vendor>AMD</vendor>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='x2apic'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='hypervisor'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='stibp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='ssbd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='overflow-recov'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='succor'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='ibrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='lbrv'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='tsc-scale'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='flushbyasid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='pause-filter'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='pfthreshold'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='disable' name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <mode name='custom' supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-noTSX'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cooperlake'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cooperlake-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cooperlake-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Denverton'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Denverton-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Denverton-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Denverton-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Dhyana-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Genoa'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amd-psfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='auto-ibrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='stibp-always-on'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amd-psfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='auto-ibrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='stibp-always-on'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Milan'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Milan-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Milan-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amd-psfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='stibp-always-on'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='GraniteRapids'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='prefetchiti'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='GraniteRapids-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='prefetchiti'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='GraniteRapids-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx10'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx10-128'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx10-256'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx10-512'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='prefetchiti'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-noTSX'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v5'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v6'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v7'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='KnightsMill'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512er'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512pf'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='KnightsMill-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512er'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512pf'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G4-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G5'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tbm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G5-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tbm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SierraForest'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cmpccxadd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SierraForest-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cmpccxadd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v5'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='athlon'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='athlon-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='core2duo'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='core2duo-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='coreduo'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='coreduo-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='n270'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='n270-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='phenom'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='phenom-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <memoryBacking supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <enum name='sourceType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>file</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>anonymous</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>memfd</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </memoryBacking>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <disk supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='diskDevice'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>disk</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>cdrom</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>floppy</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>lun</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='bus'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>ide</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>fdc</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>scsi</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>usb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>sata</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio-transitional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio-non-transitional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <graphics supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vnc</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>egl-headless</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>dbus</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </graphics>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <video supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='modelType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vga</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>cirrus</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>none</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>bochs</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>ramfb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <hostdev supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='mode'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>subsystem</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='startupPolicy'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>default</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>mandatory</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>requisite</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>optional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='subsysType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>usb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pci</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>scsi</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='capsType'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='pciBackend'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </hostdev>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <rng supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio-transitional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio-non-transitional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendModel'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>random</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>egd</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>builtin</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <filesystem supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='driverType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>path</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>handle</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtiofs</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </filesystem>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <tpm supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>tpm-tis</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>tpm-crb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendModel'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>emulator</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>external</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendVersion'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>2.0</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </tpm>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <redirdev supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='bus'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>usb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </redirdev>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <channel supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pty</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>unix</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </channel>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <crypto supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>qemu</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendModel'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>builtin</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </crypto>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <interface supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>default</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>passt</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <panic supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>isa</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>hyperv</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </panic>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <console supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>null</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vc</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pty</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>dev</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>file</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pipe</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>stdio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>udp</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>tcp</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>unix</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>qemu-vdagent</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>dbus</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </console>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <gic supported='no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <vmcoreinfo supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <genid supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <backingStoreInput supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <backup supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <async-teardown supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <ps2 supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <sev supported='no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <sgx supported='no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <hyperv supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='features'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>relaxed</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vapic</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>spinlocks</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vpindex</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>runtime</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>synic</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>stimer</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>reset</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vendor_id</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>frequencies</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>reenlightenment</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>tlbflush</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>ipi</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>avic</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>emsr_bitmap</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>xmm_input</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <defaults>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <spinlocks>4095</spinlocks>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <stimer_direct>on</stimer_direct>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </defaults>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </hyperv>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <launchSecurity supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='sectype'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>tdx</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </launchSecurity>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:34:46 np0005546954 nova_compute[187160]: </domainCapabilities>
Dec  5 07:34:46 np0005546954 nova_compute[187160]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.837 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  5 07:34:46 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.844 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  5 07:34:46 np0005546954 nova_compute[187160]: <domainCapabilities>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <domain>kvm</domain>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <arch>x86_64</arch>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <vcpu max='4096'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <iothreads supported='yes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <os supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <enum name='firmware'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>efi</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <loader supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>rom</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pflash</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='readonly'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>yes</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>no</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='secure'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>yes</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>no</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </loader>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <cpu>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <mode name='host-passthrough' supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='hostPassthroughMigratable'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>on</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>off</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <mode name='maximum' supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='maximumMigratable'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>on</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>off</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <mode name='host-model' supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <vendor>AMD</vendor>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='x2apic'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='hypervisor'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='stibp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='ssbd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='overflow-recov'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='succor'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='ibrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='lbrv'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='tsc-scale'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='flushbyasid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='pause-filter'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='pfthreshold'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <feature policy='disable' name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <mode name='custom' supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-noTSX'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cooperlake'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cooperlake-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Cooperlake-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Denverton'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Denverton-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Denverton-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Denverton-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Dhyana-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Genoa'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amd-psfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='auto-ibrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='stibp-always-on'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amd-psfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='auto-ibrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='stibp-always-on'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Milan'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Milan-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Milan-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amd-psfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='stibp-always-on'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='EPYC-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='GraniteRapids'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='prefetchiti'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='GraniteRapids-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='prefetchiti'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='GraniteRapids-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx10'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx10-128'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx10-256'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx10-512'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='prefetchiti'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-noTSX'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v5'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v6'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v7'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='KnightsMill'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512er'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512pf'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='KnightsMill-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512er'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512pf'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G4-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G5'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tbm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G5-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tbm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SierraForest'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cmpccxadd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='SierraForest-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-ifma'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cmpccxadd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v5'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v2'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v3'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v4'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='athlon'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='athlon-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='core2duo'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='core2duo-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='coreduo'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='coreduo-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='n270'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='n270-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='phenom'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <blockers model='phenom-v1'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <memoryBacking supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <enum name='sourceType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>file</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>anonymous</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <value>memfd</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  </memoryBacking>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <disk supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='diskDevice'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>disk</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>cdrom</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>floppy</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>lun</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='bus'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>fdc</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>scsi</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>usb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>sata</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio-transitional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio-non-transitional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <graphics supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vnc</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>egl-headless</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>dbus</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </graphics>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <video supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='modelType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>vga</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>cirrus</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>none</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>bochs</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>ramfb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <hostdev supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='mode'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>subsystem</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='startupPolicy'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>default</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>mandatory</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>requisite</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>optional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='subsysType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>usb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pci</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>scsi</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='capsType'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='pciBackend'/>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </hostdev>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <rng supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio-transitional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtio-non-transitional</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendModel'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>random</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>egd</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>builtin</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <filesystem supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='driverType'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>path</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>handle</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>virtiofs</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </filesystem>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <tpm supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>tpm-tis</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>tpm-crb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendModel'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>emulator</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>external</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='backendVersion'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>2.0</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </tpm>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <redirdev supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='bus'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>usb</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </redirdev>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <channel supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>pty</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:        <value>unix</value>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    </channel>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:    <crypto supported='yes'>
Dec  5 07:34:46 np0005546954 nova_compute[187160]:      <enum name='model'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>qemu</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='backendModel'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>builtin</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </crypto>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <interface supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='backendType'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>default</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>passt</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <panic supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>isa</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>hyperv</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </panic>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <console supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>null</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>vc</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>pty</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>dev</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>file</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>pipe</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>stdio</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>udp</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>tcp</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>unix</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>qemu-vdagent</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>dbus</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </console>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <gic supported='no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <vmcoreinfo supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <genid supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <backingStoreInput supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <backup supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <async-teardown supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <ps2 supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <sev supported='no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <sgx supported='no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <hyperv supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='features'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>relaxed</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>vapic</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>spinlocks</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>vpindex</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>runtime</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>synic</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>stimer</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>reset</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>vendor_id</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>frequencies</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>reenlightenment</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>tlbflush</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>ipi</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>avic</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>emsr_bitmap</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>xmm_input</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <defaults>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <spinlocks>4095</spinlocks>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <stimer_direct>on</stimer_direct>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </defaults>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </hyperv>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <launchSecurity supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='sectype'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>tdx</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </launchSecurity>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:34:47 np0005546954 nova_compute[187160]: </domainCapabilities>
Dec  5 07:34:47 np0005546954 nova_compute[187160]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.907 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  5 07:34:47 np0005546954 nova_compute[187160]: <domainCapabilities>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <domain>kvm</domain>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <arch>x86_64</arch>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <vcpu max='240'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <iothreads supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <os supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <enum name='firmware'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <loader supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>rom</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>pflash</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='readonly'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>yes</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>no</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='secure'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>no</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </loader>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <cpu>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <mode name='host-passthrough' supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='hostPassthroughMigratable'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>on</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>off</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <mode name='maximum' supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='maximumMigratable'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>on</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>off</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <mode name='host-model' supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <vendor>AMD</vendor>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='x2apic'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='hypervisor'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='stibp'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='ssbd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='overflow-recov'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='succor'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='ibrs'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='lbrv'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='tsc-scale'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='flushbyasid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='pause-filter'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='pfthreshold'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <feature policy='disable' name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <mode name='custom' supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Broadwell'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-IBRS'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-noTSX'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v3'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Broadwell-v4'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Cooperlake'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Cooperlake-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Cooperlake-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Denverton'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Denverton-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Denverton-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Denverton-v3'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Dhyana-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Genoa'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amd-psfd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='auto-ibrs'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='stibp-always-on'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amd-psfd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='auto-ibrs'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='stibp-always-on'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Milan'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Milan-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Milan-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amd-psfd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='no-nested-data-bp'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='null-sel-clr-base'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='stibp-always-on'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='EPYC-Rome-v3'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='EPYC-v3'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='EPYC-v4'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='GraniteRapids'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-fp16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='prefetchiti'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='GraniteRapids-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-fp16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='prefetchiti'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='GraniteRapids-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-fp16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx10'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx10-128'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx10-256'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx10-512'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='prefetchiti'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Haswell'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Haswell-IBRS'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Haswell-noTSX'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v3'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Haswell-v4'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v3'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v4'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v5'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v6'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Icelake-Server-v7'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge-IBRS'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='IvyBridge-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='KnightsMill'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512er'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512pf'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='KnightsMill-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-4fmaps'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-4vnniw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512er'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512pf'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G4'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G4-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G5'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='tbm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Opteron_G5-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fma4'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='tbm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xop'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='SapphireRapids-v3'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-int8'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='amx-tile'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-bf16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-fp16'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512-vpopcntdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bitalg'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vbmi2'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrc'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fzrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='la57'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='taa-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='tsx-ldtrk'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xfd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='SierraForest'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='cmpccxadd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='SierraForest-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-ifma'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-ne-convert'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-vnni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx-vnni-int8'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='bus-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='cmpccxadd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fbsdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='fsrs'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ibrs-all'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='mcdt-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pbrsb-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='psdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='sbdr-ssdp-no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='serialize'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vaes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='vpclmulqdq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v3'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Client-v4'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='hle'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='rtm'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v3'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v4'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Skylake-Server-v5'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512bw'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512cd'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512dq'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512f'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='avx512vl'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='invpcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pcid'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='pku'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Snowridge'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='mpx'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v2'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v3'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='core-capability'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='split-lock-detect'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='Snowridge-v4'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='cldemote'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='erms'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='gfni'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdir64b'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='movdiri'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='xsaves'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='athlon'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='athlon-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='core2duo'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='core2duo-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='coreduo'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='coreduo-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='n270'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='n270-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='ss'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='phenom'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <blockers model='phenom-v1'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='3dnow'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <feature name='3dnowext'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </blockers>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </mode>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <memoryBacking supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <enum name='sourceType'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <value>file</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <value>anonymous</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <value>memfd</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  </memoryBacking>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <disk supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='diskDevice'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>disk</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>cdrom</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>floppy</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>lun</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='bus'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>ide</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>fdc</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>scsi</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>usb</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>sata</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>virtio-transitional</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>virtio-non-transitional</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <graphics supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>vnc</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>egl-headless</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>dbus</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </graphics>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <video supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='modelType'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>vga</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>cirrus</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>none</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>bochs</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>ramfb</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <hostdev supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='mode'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>subsystem</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='startupPolicy'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>default</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>mandatory</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>requisite</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>optional</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='subsysType'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>usb</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>pci</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>scsi</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='capsType'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='pciBackend'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </hostdev>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <rng supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>virtio</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>virtio-transitional</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>virtio-non-transitional</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='backendModel'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>random</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>egd</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>builtin</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <filesystem supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='driverType'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>path</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>handle</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>virtiofs</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </filesystem>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <tpm supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>tpm-tis</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>tpm-crb</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='backendModel'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>emulator</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>external</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='backendVersion'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>2.0</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </tpm>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <redirdev supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='bus'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>usb</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </redirdev>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <channel supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>pty</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>unix</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </channel>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <crypto supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='model'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>qemu</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='backendModel'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>builtin</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </crypto>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <interface supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='backendType'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>default</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>passt</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <panic supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='model'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>isa</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>hyperv</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </panic>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <console supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='type'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>null</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>vc</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>pty</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>dev</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>file</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>pipe</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>stdio</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>udp</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>tcp</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>unix</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>qemu-vdagent</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>dbus</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </console>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <gic supported='no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <vmcoreinfo supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <genid supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <backingStoreInput supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <backup supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <async-teardown supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <ps2 supported='yes'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <sev supported='no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <sgx supported='no'/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <hyperv supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='features'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>relaxed</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>vapic</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>spinlocks</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>vpindex</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>runtime</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>synic</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>stimer</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>reset</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>vendor_id</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>frequencies</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>reenlightenment</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>tlbflush</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>ipi</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>avic</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>emsr_bitmap</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>xmm_input</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <defaults>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <spinlocks>4095</spinlocks>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <stimer_direct>on</stimer_direct>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </defaults>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </hyperv>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    <launchSecurity supported='yes'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      <enum name='sectype'>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:        <value>tdx</value>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:      </enum>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:    </launchSecurity>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:34:47 np0005546954 nova_compute[187160]: </domainCapabilities>
Dec  5 07:34:47 np0005546954 nova_compute[187160]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.977 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.977 187164 INFO nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Secure Boot support detected#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.980 187164 INFO nova.virt.libvirt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.980 187164 INFO nova.virt.libvirt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.990 187164 DEBUG nova.virt.libvirt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] cpu compare xml: <cpu match="exact">
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <model>Nehalem</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]: </cpu>
Dec  5 07:34:47 np0005546954 nova_compute[187160]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:46.992 187164 DEBUG nova.virt.libvirt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.010 187164 INFO nova.virt.node [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Determined node identity eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from /var/lib/nova/compute_id#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.029 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Verified node eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.056 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.123 187164 ERROR nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Could not retrieve compute node resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b' not found: No resource provider with uuid eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b found  ", "request_id": "req-5170fa3f-0a66-48ca-b2d3-d9ed3545dbbb"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b' not found: No resource provider with uuid eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b found  ", "request_id": "req-5170fa3f-0a66-48ca-b2d3-d9ed3545dbbb"}]}#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.142 187164 DEBUG oslo_concurrency.lockutils [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.142 187164 DEBUG oslo_concurrency.lockutils [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.142 187164 DEBUG oslo_concurrency.lockutils [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.142 187164 DEBUG nova.compute.resource_tracker [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.300 187164 WARNING nova.virt.libvirt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.301 187164 DEBUG nova.compute.resource_tracker [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6179MB free_disk=73.54240417480469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.301 187164 DEBUG oslo_concurrency.lockutils [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.301 187164 DEBUG oslo_concurrency.lockutils [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.456 187164 ERROR nova.compute.resource_tracker [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b' not found: No resource provider with uuid eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b found  ", "request_id": "req-1aa05094-572e-4868-ae4b-f1287379700a"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b' not found: No resource provider with uuid eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b found  ", "request_id": "req-1aa05094-572e-4868-ae4b-f1287379700a"}]}#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.457 187164 DEBUG nova.compute.resource_tracker [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.457 187164 DEBUG nova.compute.resource_tracker [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.494 187164 INFO nova.scheduler.client.report [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [req-6264c8a2-ed62-4a84-a644-c4182563cde1] Created resource provider record via placement API for resource provider with UUID eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b and name compute-1.ctlplane.example.com.#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.517 187164 DEBUG nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec  5 07:34:47 np0005546954 nova_compute[187160]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.518 187164 INFO nova.virt.libvirt.host [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] kernel doesn't support AMD SEV#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.519 187164 DEBUG nova.compute.provider_tree [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.519 187164 DEBUG nova.virt.libvirt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.521 187164 DEBUG nova.virt.libvirt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Libvirt baseline CPU <cpu>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <arch>x86_64</arch>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <model>Nehalem</model>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <vendor>AMD</vendor>
Dec  5 07:34:47 np0005546954 nova_compute[187160]:  <topology sockets="8" cores="1" threads="1"/>
Dec  5 07:34:47 np0005546954 nova_compute[187160]: </cpu>
Dec  5 07:34:47 np0005546954 nova_compute[187160]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.571 187164 DEBUG nova.scheduler.client.report [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Updated inventory for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.571 187164 DEBUG nova.compute.provider_tree [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Updating resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.571 187164 DEBUG nova.compute.provider_tree [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.726 187164 DEBUG nova.compute.provider_tree [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Updating resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.768 187164 DEBUG nova.compute.resource_tracker [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.768 187164 DEBUG oslo_concurrency.lockutils [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.467s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.768 187164 DEBUG nova.service [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.834 187164 DEBUG nova.service [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Dec  5 07:34:47 np0005546954 nova_compute[187160]: 2025-12-05 12:34:47.834 187164 DEBUG nova.servicegroup.drivers.db [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Dec  5 07:34:51 np0005546954 podman[187459]: 2025-12-05 12:34:51.582006655 +0000 UTC m=+0.074978601 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Dec  5 07:34:51 np0005546954 systemd-logind[789]: New session 27 of user zuul.
Dec  5 07:34:51 np0005546954 systemd[1]: Started Session 27 of User zuul.
Dec  5 07:34:52 np0005546954 python3.9[187631]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 07:34:54 np0005546954 python3.9[187787]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 07:34:54 np0005546954 systemd[1]: Reloading.
Dec  5 07:34:54 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:34:54 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:34:55 np0005546954 python3.9[187972]: ansible-ansible.builtin.service_facts Invoked
Dec  5 07:34:55 np0005546954 network[187989]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 07:34:55 np0005546954 network[187990]: 'network-scripts' will be removed from distribution in near future.
Dec  5 07:34:55 np0005546954 network[187991]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 07:34:57 np0005546954 podman[188041]: 2025-12-05 12:34:57.795945773 +0000 UTC m=+0.137938604 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:35:01 np0005546954 python3.9[188289]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:35:02 np0005546954 python3.9[188442]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:02 np0005546954 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:35:03 np0005546954 python3.9[188595]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:03 np0005546954 python3.9[188747]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:35:04 np0005546954 python3.9[188899]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  5 07:35:05 np0005546954 python3.9[189051]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 07:35:05 np0005546954 systemd[1]: Reloading.
Dec  5 07:35:05 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:35:05 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:35:06 np0005546954 nova_compute[187160]: 2025-12-05 12:35:06.837 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:06 np0005546954 nova_compute[187160]: 2025-12-05 12:35:06.871 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:06 np0005546954 python3.9[189238]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:35:07 np0005546954 python3.9[189391]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:35:08 np0005546954 podman[189515]: 2025-12-05 12:35:08.357439577 +0000 UTC m=+0.066844857 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 07:35:08 np0005546954 python3.9[189558]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:35:09 np0005546954 python3.9[189713]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:09 np0005546954 python3.9[189834]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764938108.8027256-247-197176064987749/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:35:10 np0005546954 python3.9[189986]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec  5 07:35:12 np0005546954 python3.9[190138]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec  5 07:35:12 np0005546954 python3.9[190291]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  5 07:35:13 np0005546954 python3.9[190449]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  5 07:35:15 np0005546954 python3.9[190607]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:15 np0005546954 python3.9[190728]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764938114.8567188-383-4291189411525/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:16 np0005546954 python3.9[190878]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:35:16.926 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:35:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:35:16.927 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:35:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:35:16.927 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:35:17 np0005546954 python3.9[190999]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764938116.0897229-383-256131149087205/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:17 np0005546954 python3.9[191149]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:18 np0005546954 python3.9[191270]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764938117.3004515-383-169015404605521/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:19 np0005546954 python3.9[191420]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:35:19 np0005546954 python3.9[191572]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:35:20 np0005546954 python3.9[191724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:21 np0005546954 python3.9[191845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764938120.141848-501-61449915163124/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:21 np0005546954 podman[191969]: 2025-12-05 12:35:21.851623815 +0000 UTC m=+0.069682635 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:35:22 np0005546954 python3.9[192008]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:22 np0005546954 python3.9[192090]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:23 np0005546954 python3.9[192240]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:23 np0005546954 python3.9[192361]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764938122.6942704-501-27015986337921/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:24 np0005546954 python3.9[192511]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:25 np0005546954 python3.9[192632]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764938123.974408-501-219499554189511/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:25 np0005546954 python3.9[192782]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:26 np0005546954 python3.9[192903]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764938125.276341-501-248516557426713/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:27 np0005546954 python3.9[193053]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:27 np0005546954 python3.9[193174]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764938126.6931918-501-12444878650865/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:28 np0005546954 podman[193298]: 2025-12-05 12:35:28.29424009 +0000 UTC m=+0.122304845 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  5 07:35:28 np0005546954 python3.9[193337]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:28 np0005546954 python3.9[193472]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764938127.8767495-501-16986424638873/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:29 np0005546954 python3.9[193622]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:30 np0005546954 python3.9[193743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764938129.1506653-501-122219887765447/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:31 np0005546954 python3.9[193893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:31 np0005546954 python3.9[194014]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764938130.4509122-501-55235733376444/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:32 np0005546954 python3.9[194164]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:32 np0005546954 python3.9[194285]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764938131.8083847-501-50333056288378/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:33 np0005546954 python3.9[194435]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:34 np0005546954 python3.9[194556]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764938133.058281-501-218678995836311/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:34 np0005546954 python3.9[194706]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:35 np0005546954 python3.9[194782]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:36 np0005546954 python3.9[194932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:36 np0005546954 python3.9[195008]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:37 np0005546954 python3.9[195158]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:37 np0005546954 python3.9[195234]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:38 np0005546954 podman[195358]: 2025-12-05 12:35:38.497153683 +0000 UTC m=+0.079061960 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd)
Dec  5 07:35:38 np0005546954 python3.9[195404]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:39 np0005546954 python3.9[195559]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:40 np0005546954 python3.9[195711]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:35:41 np0005546954 python3.9[195863]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:35:41 np0005546954 systemd[1]: Reloading.
Dec  5 07:35:41 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:35:41 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:35:41 np0005546954 systemd[1]: Listening on Podman API Socket.
Dec  5 07:35:42 np0005546954 python3.9[196054]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:43 np0005546954 python3.9[196177]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764938142.100189-945-80744552252588/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:35:44 np0005546954 python3.9[196329]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec  5 07:35:45 np0005546954 python3.9[196481]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.042 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.045 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.045 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.046 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.223 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.224 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.225 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.225 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.226 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.226 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.227 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.227 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.227 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.264 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.265 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.266 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.266 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.486 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.488 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6164MB free_disk=73.5423812866211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.488 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.488 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.556 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.556 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.577 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.590 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.591 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:35:46 np0005546954 nova_compute[187160]: 2025-12-05 12:35:46.592 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:35:46 np0005546954 python3[196633]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 07:35:48 np0005546954 podman[196646]: 2025-12-05 12:35:48.093015187 +0000 UTC m=+1.257503297 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec  5 07:35:48 np0005546954 podman[196746]: 2025-12-05 12:35:48.275352303 +0000 UTC m=+0.067664352 container create 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Dec  5 07:35:48 np0005546954 podman[196746]: 2025-12-05 12:35:48.245620491 +0000 UTC m=+0.037932560 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec  5 07:35:48 np0005546954 python3[196633]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Dec  5 07:35:49 np0005546954 python3.9[196936]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:35:50 np0005546954 python3.9[197090]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:51 np0005546954 python3.9[197241]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764938150.9292297-1051-3715156996540/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:35:51 np0005546954 podman[197289]: 2025-12-05 12:35:51.978484233 +0000 UTC m=+0.065332218 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec  5 07:35:52 np0005546954 python3.9[197336]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 07:35:52 np0005546954 systemd[1]: Reloading.
Dec  5 07:35:52 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:35:52 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:35:53 np0005546954 python3.9[197446]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:35:53 np0005546954 systemd[1]: Reloading.
Dec  5 07:35:53 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:35:53 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:35:53 np0005546954 systemd[1]: Starting podman_exporter container...
Dec  5 07:35:53 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:35:53 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/751a42bef7ee2faf57c53690a21b78cbcc55032fe9d1cf76a519a405a97b924a/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  5 07:35:53 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/751a42bef7ee2faf57c53690a21b78cbcc55032fe9d1cf76a519a405a97b924a/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  5 07:35:53 np0005546954 systemd[1]: Started /usr/bin/podman healthcheck run 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3.
Dec  5 07:35:53 np0005546954 podman[197486]: 2025-12-05 12:35:53.838264835 +0000 UTC m=+0.164274011 container init 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:35:53 np0005546954 podman_exporter[197502]: ts=2025-12-05T12:35:53.857Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec  5 07:35:53 np0005546954 podman_exporter[197502]: ts=2025-12-05T12:35:53.857Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec  5 07:35:53 np0005546954 podman_exporter[197502]: ts=2025-12-05T12:35:53.857Z caller=handler.go:94 level=info msg="enabled collectors"
Dec  5 07:35:53 np0005546954 podman_exporter[197502]: ts=2025-12-05T12:35:53.857Z caller=handler.go:105 level=info collector=container
Dec  5 07:35:53 np0005546954 podman[197486]: 2025-12-05 12:35:53.874873813 +0000 UTC m=+0.200882989 container start 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:35:53 np0005546954 podman[197486]: podman_exporter
Dec  5 07:35:53 np0005546954 systemd[1]: Starting Podman API Service...
Dec  5 07:35:53 np0005546954 systemd[1]: Started podman_exporter container.
Dec  5 07:35:53 np0005546954 systemd[1]: Started Podman API Service.
Dec  5 07:35:53 np0005546954 podman[197513]: time="2025-12-05T12:35:53Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec  5 07:35:53 np0005546954 podman[197513]: time="2025-12-05T12:35:53Z" level=info msg="Setting parallel job count to 25"
Dec  5 07:35:53 np0005546954 podman[197513]: time="2025-12-05T12:35:53Z" level=info msg="Using sqlite as database backend"
Dec  5 07:35:53 np0005546954 podman[197513]: time="2025-12-05T12:35:53Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec  5 07:35:53 np0005546954 podman[197513]: time="2025-12-05T12:35:53Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec  5 07:35:53 np0005546954 podman[197513]: time="2025-12-05T12:35:53Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Dec  5 07:35:53 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:35:53 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec  5 07:35:53 np0005546954 podman[197513]: time="2025-12-05T12:35:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:35:53 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:35:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14065 "" "Go-http-client/1.1"
Dec  5 07:35:53 np0005546954 podman_exporter[197502]: ts=2025-12-05T12:35:53.970Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec  5 07:35:53 np0005546954 podman_exporter[197502]: ts=2025-12-05T12:35:53.971Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec  5 07:35:53 np0005546954 podman_exporter[197502]: ts=2025-12-05T12:35:53.972Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec  5 07:35:53 np0005546954 podman[197512]: 2025-12-05 12:35:53.976317144 +0000 UTC m=+0.089229529 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:35:53 np0005546954 systemd[1]: 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3-4f93a4fd1de62212.service: Main process exited, code=exited, status=1/FAILURE
Dec  5 07:35:53 np0005546954 systemd[1]: 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3-4f93a4fd1de62212.service: Failed with result 'exit-code'.
Dec  5 07:35:55 np0005546954 python3.9[197700]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:35:55 np0005546954 systemd[1]: Stopping podman_exporter container...
Dec  5 07:35:55 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:35:53 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Dec  5 07:35:55 np0005546954 systemd[1]: libpod-53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3.scope: Deactivated successfully.
Dec  5 07:35:55 np0005546954 podman[197704]: 2025-12-05 12:35:55.292559174 +0000 UTC m=+0.116916678 container died 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:35:55 np0005546954 systemd[1]: 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3-4f93a4fd1de62212.timer: Deactivated successfully.
Dec  5 07:35:55 np0005546954 systemd[1]: Stopped /usr/bin/podman healthcheck run 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3.
Dec  5 07:35:55 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3-userdata-shm.mount: Deactivated successfully.
Dec  5 07:35:55 np0005546954 systemd[1]: var-lib-containers-storage-overlay-751a42bef7ee2faf57c53690a21b78cbcc55032fe9d1cf76a519a405a97b924a-merged.mount: Deactivated successfully.
Dec  5 07:35:55 np0005546954 podman[197704]: 2025-12-05 12:35:55.913963358 +0000 UTC m=+0.738320882 container cleanup 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:35:55 np0005546954 podman[197704]: podman_exporter
Dec  5 07:35:55 np0005546954 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec  5 07:35:55 np0005546954 podman[197731]: podman_exporter
Dec  5 07:35:55 np0005546954 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec  5 07:35:55 np0005546954 systemd[1]: Stopped podman_exporter container.
Dec  5 07:35:56 np0005546954 systemd[1]: Starting podman_exporter container...
Dec  5 07:35:56 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:35:56 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/751a42bef7ee2faf57c53690a21b78cbcc55032fe9d1cf76a519a405a97b924a/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  5 07:35:56 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/751a42bef7ee2faf57c53690a21b78cbcc55032fe9d1cf76a519a405a97b924a/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  5 07:35:56 np0005546954 systemd[1]: Started /usr/bin/podman healthcheck run 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3.
Dec  5 07:35:56 np0005546954 podman[197744]: 2025-12-05 12:35:56.160515158 +0000 UTC m=+0.133757505 container init 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:35:56 np0005546954 podman_exporter[197759]: ts=2025-12-05T12:35:56.188Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec  5 07:35:56 np0005546954 podman_exporter[197759]: ts=2025-12-05T12:35:56.188Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec  5 07:35:56 np0005546954 podman_exporter[197759]: ts=2025-12-05T12:35:56.188Z caller=handler.go:94 level=info msg="enabled collectors"
Dec  5 07:35:56 np0005546954 podman_exporter[197759]: ts=2025-12-05T12:35:56.188Z caller=handler.go:105 level=info collector=container
Dec  5 07:35:56 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:35:56 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec  5 07:35:56 np0005546954 podman[197513]: time="2025-12-05T12:35:56Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:35:56 np0005546954 podman[197744]: 2025-12-05 12:35:56.192234912 +0000 UTC m=+0.165477279 container start 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:35:56 np0005546954 podman[197744]: podman_exporter
Dec  5 07:35:56 np0005546954 systemd[1]: Started podman_exporter container.
Dec  5 07:35:56 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:35:56 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14067 "" "Go-http-client/1.1"
Dec  5 07:35:56 np0005546954 podman_exporter[197759]: ts=2025-12-05T12:35:56.214Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec  5 07:35:56 np0005546954 podman_exporter[197759]: ts=2025-12-05T12:35:56.215Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec  5 07:35:56 np0005546954 podman_exporter[197759]: ts=2025-12-05T12:35:56.215Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec  5 07:35:56 np0005546954 podman[197768]: 2025-12-05 12:35:56.269174715 +0000 UTC m=+0.062725148 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:35:56 np0005546954 python3.9[197944]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:35:57 np0005546954 python3.9[198067]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764938156.420936-1115-218017045014105/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 07:35:58 np0005546954 python3.9[198219]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec  5 07:35:58 np0005546954 podman[198220]: 2025-12-05 12:35:58.597532208 +0000 UTC m=+0.108466711 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:35:59 np0005546954 python3.9[198397]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 07:36:00 np0005546954 python3[198549]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 07:36:02 np0005546954 podman[198560]: 2025-12-05 12:36:02.512692876 +0000 UTC m=+2.212210924 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec  5 07:36:02 np0005546954 podman[198657]: 2025-12-05 12:36:02.658318602 +0000 UTC m=+0.049847935 container create 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, config_id=edpm, architecture=x86_64)
Dec  5 07:36:02 np0005546954 podman[198657]: 2025-12-05 12:36:02.633578005 +0000 UTC m=+0.025107348 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec  5 07:36:02 np0005546954 python3[198549]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec  5 07:36:03 np0005546954 python3.9[198848]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:36:04 np0005546954 python3.9[199002]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:05 np0005546954 python3.9[199153]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764938164.406385-1221-65998516423886/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:05 np0005546954 python3.9[199229]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 07:36:05 np0005546954 systemd[1]: Reloading.
Dec  5 07:36:05 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:36:05 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:36:06 np0005546954 python3.9[199340]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 07:36:06 np0005546954 systemd[1]: Reloading.
Dec  5 07:36:06 np0005546954 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 07:36:06 np0005546954 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 07:36:07 np0005546954 systemd[1]: Starting openstack_network_exporter container...
Dec  5 07:36:07 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:36:07 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26338664221c21be6dba6bbf107d870cb9fcafae87eaade5253d7ca009e87b1b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  5 07:36:07 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26338664221c21be6dba6bbf107d870cb9fcafae87eaade5253d7ca009e87b1b/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  5 07:36:07 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26338664221c21be6dba6bbf107d870cb9fcafae87eaade5253d7ca009e87b1b/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  5 07:36:07 np0005546954 systemd[1]: Started /usr/bin/podman healthcheck run 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601.
Dec  5 07:36:07 np0005546954 podman[199382]: 2025-12-05 12:36:07.198931049 +0000 UTC m=+0.127924853 container init 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7)
Dec  5 07:36:07 np0005546954 openstack_network_exporter[199397]: INFO    12:36:07 main.go:48: registering *bridge.Collector
Dec  5 07:36:07 np0005546954 openstack_network_exporter[199397]: INFO    12:36:07 main.go:48: registering *coverage.Collector
Dec  5 07:36:07 np0005546954 openstack_network_exporter[199397]: INFO    12:36:07 main.go:48: registering *datapath.Collector
Dec  5 07:36:07 np0005546954 openstack_network_exporter[199397]: INFO    12:36:07 main.go:48: registering *iface.Collector
Dec  5 07:36:07 np0005546954 openstack_network_exporter[199397]: INFO    12:36:07 main.go:48: registering *memory.Collector
Dec  5 07:36:07 np0005546954 openstack_network_exporter[199397]: INFO    12:36:07 main.go:48: registering *ovnnorthd.Collector
Dec  5 07:36:07 np0005546954 openstack_network_exporter[199397]: INFO    12:36:07 main.go:48: registering *ovn.Collector
Dec  5 07:36:07 np0005546954 openstack_network_exporter[199397]: INFO    12:36:07 main.go:48: registering *ovsdbserver.Collector
Dec  5 07:36:07 np0005546954 openstack_network_exporter[199397]: INFO    12:36:07 main.go:48: registering *pmd_perf.Collector
Dec  5 07:36:07 np0005546954 openstack_network_exporter[199397]: INFO    12:36:07 main.go:48: registering *pmd_rxq.Collector
Dec  5 07:36:07 np0005546954 openstack_network_exporter[199397]: INFO    12:36:07 main.go:48: registering *vswitch.Collector
Dec  5 07:36:07 np0005546954 openstack_network_exporter[199397]: NOTICE  12:36:07 main.go:76: listening on https://:9105/metrics
Dec  5 07:36:07 np0005546954 podman[199382]: 2025-12-05 12:36:07.224877042 +0000 UTC m=+0.153870836 container start 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Dec  5 07:36:07 np0005546954 podman[199382]: openstack_network_exporter
Dec  5 07:36:07 np0005546954 systemd[1]: Started openstack_network_exporter container.
Dec  5 07:36:07 np0005546954 podman[199407]: 2025-12-05 12:36:07.326174778 +0000 UTC m=+0.088400122 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, vcs-type=git, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc.)
Dec  5 07:36:08 np0005546954 python3.9[199582]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 07:36:08 np0005546954 systemd[1]: Stopping openstack_network_exporter container...
Dec  5 07:36:08 np0005546954 systemd[1]: libpod-02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601.scope: Deactivated successfully.
Dec  5 07:36:08 np0005546954 podman[199586]: 2025-12-05 12:36:08.309467969 +0000 UTC m=+0.074505777 container died 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  5 07:36:08 np0005546954 systemd[1]: 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601-5ebfeab4f23736f2.timer: Deactivated successfully.
Dec  5 07:36:08 np0005546954 systemd[1]: Stopped /usr/bin/podman healthcheck run 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601.
Dec  5 07:36:08 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601-userdata-shm.mount: Deactivated successfully.
Dec  5 07:36:08 np0005546954 systemd[1]: var-lib-containers-storage-overlay-26338664221c21be6dba6bbf107d870cb9fcafae87eaade5253d7ca009e87b1b-merged.mount: Deactivated successfully.
Dec  5 07:36:09 np0005546954 podman[199586]: 2025-12-05 12:36:09.553217336 +0000 UTC m=+1.318255144 container cleanup 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec  5 07:36:09 np0005546954 podman[199586]: openstack_network_exporter
Dec  5 07:36:09 np0005546954 podman[199613]: 2025-12-05 12:36:09.555765665 +0000 UTC m=+0.068864799 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:36:09 np0005546954 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec  5 07:36:09 np0005546954 podman[199633]: openstack_network_exporter
Dec  5 07:36:09 np0005546954 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec  5 07:36:09 np0005546954 systemd[1]: Stopped openstack_network_exporter container.
Dec  5 07:36:09 np0005546954 systemd[1]: Starting openstack_network_exporter container...
Dec  5 07:36:09 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:36:09 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26338664221c21be6dba6bbf107d870cb9fcafae87eaade5253d7ca009e87b1b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  5 07:36:09 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26338664221c21be6dba6bbf107d870cb9fcafae87eaade5253d7ca009e87b1b/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  5 07:36:09 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26338664221c21be6dba6bbf107d870cb9fcafae87eaade5253d7ca009e87b1b/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  5 07:36:09 np0005546954 systemd[1]: Started /usr/bin/podman healthcheck run 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601.
Dec  5 07:36:09 np0005546954 podman[199646]: 2025-12-05 12:36:09.787685187 +0000 UTC m=+0.131269837 container init 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.component=ubi9-minimal-container)
Dec  5 07:36:09 np0005546954 openstack_network_exporter[199661]: INFO    12:36:09 main.go:48: registering *bridge.Collector
Dec  5 07:36:09 np0005546954 openstack_network_exporter[199661]: INFO    12:36:09 main.go:48: registering *coverage.Collector
Dec  5 07:36:09 np0005546954 openstack_network_exporter[199661]: INFO    12:36:09 main.go:48: registering *datapath.Collector
Dec  5 07:36:09 np0005546954 openstack_network_exporter[199661]: INFO    12:36:09 main.go:48: registering *iface.Collector
Dec  5 07:36:09 np0005546954 openstack_network_exporter[199661]: INFO    12:36:09 main.go:48: registering *memory.Collector
Dec  5 07:36:09 np0005546954 openstack_network_exporter[199661]: INFO    12:36:09 main.go:48: registering *ovnnorthd.Collector
Dec  5 07:36:09 np0005546954 openstack_network_exporter[199661]: INFO    12:36:09 main.go:48: registering *ovn.Collector
Dec  5 07:36:09 np0005546954 openstack_network_exporter[199661]: INFO    12:36:09 main.go:48: registering *ovsdbserver.Collector
Dec  5 07:36:09 np0005546954 openstack_network_exporter[199661]: INFO    12:36:09 main.go:48: registering *pmd_perf.Collector
Dec  5 07:36:09 np0005546954 openstack_network_exporter[199661]: INFO    12:36:09 main.go:48: registering *pmd_rxq.Collector
Dec  5 07:36:09 np0005546954 openstack_network_exporter[199661]: INFO    12:36:09 main.go:48: registering *vswitch.Collector
Dec  5 07:36:09 np0005546954 openstack_network_exporter[199661]: NOTICE  12:36:09 main.go:76: listening on https://:9105/metrics
Dec  5 07:36:09 np0005546954 podman[199646]: 2025-12-05 12:36:09.816359306 +0000 UTC m=+0.159943936 container start 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Dec  5 07:36:09 np0005546954 podman[199646]: openstack_network_exporter
Dec  5 07:36:09 np0005546954 systemd[1]: Started openstack_network_exporter container.
Dec  5 07:36:09 np0005546954 podman[199670]: 2025-12-05 12:36:09.89113439 +0000 UTC m=+0.064114571 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 07:36:10 np0005546954 python3.9[199843]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  5 07:36:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:36:16.927 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:36:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:36:16.930 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:36:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:36:16.930 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:36:22 np0005546954 podman[199868]: 2025-12-05 12:36:22.570847313 +0000 UTC m=+0.078465931 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:36:26 np0005546954 podman[199887]: 2025-12-05 12:36:26.56223634 +0000 UTC m=+0.070044677 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:36:28 np0005546954 python3.9[200038]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec  5 07:36:29 np0005546954 podman[200176]: 2025-12-05 12:36:29.117610992 +0000 UTC m=+0.116430491 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:36:29 np0005546954 python3.9[200220]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 07:36:29 np0005546954 systemd[1]: Started libpod-conmon-0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc.scope.
Dec  5 07:36:29 np0005546954 podman[200231]: 2025-12-05 12:36:29.392972366 +0000 UTC m=+0.087314939 container exec 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:36:29 np0005546954 podman[200231]: 2025-12-05 12:36:29.428698276 +0000 UTC m=+0.123040869 container exec_died 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec  5 07:36:29 np0005546954 systemd[1]: libpod-conmon-0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc.scope: Deactivated successfully.
Dec  5 07:36:30 np0005546954 python3.9[200415]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 07:36:30 np0005546954 systemd[1]: Started libpod-conmon-0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc.scope.
Dec  5 07:36:30 np0005546954 podman[200416]: 2025-12-05 12:36:30.357913501 +0000 UTC m=+0.094295298 container exec 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:36:30 np0005546954 podman[200416]: 2025-12-05 12:36:30.363675001 +0000 UTC m=+0.100056758 container exec_died 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:36:30 np0005546954 systemd[1]: libpod-conmon-0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc.scope: Deactivated successfully.
Dec  5 07:36:31 np0005546954 python3.9[200600]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:31 np0005546954 python3.9[200752]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec  5 07:36:32 np0005546954 python3.9[200917]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 07:36:32 np0005546954 systemd[1]: Started libpod-conmon-cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806.scope.
Dec  5 07:36:32 np0005546954 podman[200918]: 2025-12-05 12:36:32.739168644 +0000 UTC m=+0.085747651 container exec cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  5 07:36:32 np0005546954 podman[200918]: 2025-12-05 12:36:32.776683099 +0000 UTC m=+0.123262076 container exec_died cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 07:36:32 np0005546954 systemd[1]: libpod-conmon-cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806.scope: Deactivated successfully.
Dec  5 07:36:33 np0005546954 python3.9[201102]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 07:36:33 np0005546954 systemd[1]: Started libpod-conmon-cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806.scope.
Dec  5 07:36:33 np0005546954 podman[201103]: 2025-12-05 12:36:33.555174548 +0000 UTC m=+0.083840219 container exec cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  5 07:36:33 np0005546954 podman[201103]: 2025-12-05 12:36:33.584460007 +0000 UTC m=+0.113125678 container exec_died cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  5 07:36:33 np0005546954 systemd[1]: libpod-conmon-cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806.scope: Deactivated successfully.
Dec  5 07:36:34 np0005546954 python3.9[201287]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:35 np0005546954 python3.9[201439]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec  5 07:36:35 np0005546954 python3.9[201604]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 07:36:36 np0005546954 systemd[1]: Started libpod-conmon-bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d.scope.
Dec  5 07:36:36 np0005546954 podman[201605]: 2025-12-05 12:36:36.032938537 +0000 UTC m=+0.071410800 container exec bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:36:36 np0005546954 podman[201605]: 2025-12-05 12:36:36.069372189 +0000 UTC m=+0.107844432 container exec_died bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 07:36:36 np0005546954 systemd[1]: libpod-conmon-bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d.scope: Deactivated successfully.
Dec  5 07:36:36 np0005546954 python3.9[201786]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 07:36:36 np0005546954 systemd[1]: Started libpod-conmon-bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d.scope.
Dec  5 07:36:36 np0005546954 podman[201787]: 2025-12-05 12:36:36.920065342 +0000 UTC m=+0.083679825 container exec bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 07:36:36 np0005546954 podman[201787]: 2025-12-05 12:36:36.954625496 +0000 UTC m=+0.118239949 container exec_died bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:36:36 np0005546954 systemd[1]: libpod-conmon-bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d.scope: Deactivated successfully.
Dec  5 07:36:37 np0005546954 python3.9[201970]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:38 np0005546954 python3.9[202122]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec  5 07:36:39 np0005546954 python3.9[202287]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 07:36:39 np0005546954 systemd[1]: Started libpod-conmon-53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3.scope.
Dec  5 07:36:39 np0005546954 podman[202288]: 2025-12-05 12:36:39.322371245 +0000 UTC m=+0.078679809 container exec 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:36:39 np0005546954 podman[202288]: 2025-12-05 12:36:39.355510304 +0000 UTC m=+0.111818848 container exec_died 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:36:39 np0005546954 systemd[1]: libpod-conmon-53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3.scope: Deactivated successfully.
Dec  5 07:36:39 np0005546954 auditd[699]: Audit daemon rotating log files
Dec  5 07:36:39 np0005546954 podman[202443]: 2025-12-05 12:36:39.920969304 +0000 UTC m=+0.090662184 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 07:36:40 np0005546954 podman[202489]: 2025-12-05 12:36:40.025456379 +0000 UTC m=+0.076971534 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9)
Dec  5 07:36:40 np0005546954 python3.9[202492]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 07:36:40 np0005546954 systemd[1]: Started libpod-conmon-53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3.scope.
Dec  5 07:36:40 np0005546954 podman[202514]: 2025-12-05 12:36:40.275454118 +0000 UTC m=+0.087152294 container exec 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:36:40 np0005546954 podman[202514]: 2025-12-05 12:36:40.309927219 +0000 UTC m=+0.121625335 container exec_died 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:36:40 np0005546954 systemd[1]: libpod-conmon-53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3.scope: Deactivated successfully.
Dec  5 07:36:41 np0005546954 python3.9[202698]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:41 np0005546954 python3.9[202850]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec  5 07:36:42 np0005546954 python3.9[203015]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 07:36:42 np0005546954 systemd[1]: Started libpod-conmon-02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601.scope.
Dec  5 07:36:42 np0005546954 podman[203016]: 2025-12-05 12:36:42.906010697 +0000 UTC m=+0.099853821 container exec 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 07:36:42 np0005546954 podman[203016]: 2025-12-05 12:36:42.944893196 +0000 UTC m=+0.138736330 container exec_died 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, name=ubi9-minimal, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec  5 07:36:42 np0005546954 systemd[1]: libpod-conmon-02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601.scope: Deactivated successfully.
Dec  5 07:36:43 np0005546954 python3.9[203200]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 07:36:43 np0005546954 systemd[1]: Started libpod-conmon-02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601.scope.
Dec  5 07:36:43 np0005546954 podman[203201]: 2025-12-05 12:36:43.866228624 +0000 UTC m=+0.088395122 container exec 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Dec  5 07:36:43 np0005546954 podman[203201]: 2025-12-05 12:36:43.896829503 +0000 UTC m=+0.118995981 container exec_died 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, release=1755695350, vcs-type=git, config_id=edpm, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec  5 07:36:43 np0005546954 systemd[1]: libpod-conmon-02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601.scope: Deactivated successfully.
Dec  5 07:36:44 np0005546954 python3.9[203384]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:45 np0005546954 python3.9[203536]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:46 np0005546954 python3.9[203688]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.581 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.582 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.606 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.607 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.607 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.620 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.620 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.620 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.621 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.621 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.657 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.657 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.658 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.658 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.836 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.837 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6032MB free_disk=73.374267578125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.837 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.838 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.911 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.911 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:36:46 np0005546954 python3.9[203811]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764938205.763606-1651-35545217879152/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.930 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.944 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.946 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:36:46 np0005546954 nova_compute[187160]: 2025-12-05 12:36:46.946 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:36:47 np0005546954 nova_compute[187160]: 2025-12-05 12:36:47.364 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:47 np0005546954 nova_compute[187160]: 2025-12-05 12:36:47.365 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:36:47 np0005546954 python3.9[203963]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:48 np0005546954 nova_compute[187160]: 2025-12-05 12:36:48.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:48 np0005546954 nova_compute[187160]: 2025-12-05 12:36:48.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:48 np0005546954 python3.9[204115]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:36:48 np0005546954 python3.9[204193]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:49 np0005546954 python3.9[204345]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:36:50 np0005546954 python3.9[204423]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.i0cys_o2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:50 np0005546954 python3.9[204575]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:36:51 np0005546954 python3.9[204653]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:52 np0005546954 python3.9[204805]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:36:52 np0005546954 podman[204930]: 2025-12-05 12:36:52.912654898 +0000 UTC m=+0.074176977 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec  5 07:36:53 np0005546954 python3[204969]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  5 07:36:53 np0005546954 python3.9[205129]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:36:54 np0005546954 python3.9[205207]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:55 np0005546954 python3.9[205359]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:36:55 np0005546954 python3.9[205437]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:56 np0005546954 python3.9[205589]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:36:56 np0005546954 python3.9[205667]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:57 np0005546954 podman[205791]: 2025-12-05 12:36:57.284883639 +0000 UTC m=+0.064056488 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:36:57 np0005546954 python3.9[205843]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:36:57 np0005546954 python3.9[205921]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:36:58 np0005546954 python3.9[206073]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 07:36:59 np0005546954 podman[206198]: 2025-12-05 12:36:59.297250622 +0000 UTC m=+0.131189221 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:36:59 np0005546954 python3.9[206199]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764938218.1507952-1901-266514921557592/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:37:00 np0005546954 python3.9[206377]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:37:00 np0005546954 python3.9[206529]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:37:01 np0005546954 python3.9[206684]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:37:02 np0005546954 python3.9[206836]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:37:03 np0005546954 python3.9[206989]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 07:37:03 np0005546954 python3.9[207143]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 07:37:04 np0005546954 python3.9[207298]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 07:37:05 np0005546954 systemd[1]: session-27.scope: Deactivated successfully.
Dec  5 07:37:05 np0005546954 systemd[1]: session-27.scope: Consumed 1min 28.685s CPU time.
Dec  5 07:37:05 np0005546954 systemd-logind[789]: Session 27 logged out. Waiting for processes to exit.
Dec  5 07:37:05 np0005546954 systemd-logind[789]: Removed session 27.
Dec  5 07:37:05 np0005546954 podman[197513]: time="2025-12-05T12:37:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:37:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:37:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:37:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:37:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2554 "" "Go-http-client/1.1"
Dec  5 07:37:10 np0005546954 podman[207326]: 2025-12-05 12:37:10.580543775 +0000 UTC m=+0.074869745 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:37:10 np0005546954 podman[207325]: 2025-12-05 12:37:10.593602261 +0000 UTC m=+0.083435132 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Dec  5 07:37:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:37:16.931 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:37:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:37:16.936 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:37:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:37:16.937 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:37:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:37:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:37:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:37:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:37:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:37:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:37:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:37:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:37:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:37:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:37:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:37:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:37:23 np0005546954 podman[207372]: 2025-12-05 12:37:23.553003954 +0000 UTC m=+0.068963561 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:37:27 np0005546954 podman[207393]: 2025-12-05 12:37:27.54732296 +0000 UTC m=+0.058775363 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:37:29 np0005546954 podman[207418]: 2025-12-05 12:37:29.585507547 +0000 UTC m=+0.088818600 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:37:35 np0005546954 podman[197513]: time="2025-12-05T12:37:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:37:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:37:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:37:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:37:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2547 "" "Go-http-client/1.1"
Dec  5 07:37:41 np0005546954 podman[207448]: 2025-12-05 12:37:41.565438216 +0000 UTC m=+0.066241246 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  5 07:37:41 np0005546954 podman[207447]: 2025-12-05 12:37:41.583341835 +0000 UTC m=+0.089801071 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, architecture=x86_64)
Dec  5 07:37:46 np0005546954 nova_compute[187160]: 2025-12-05 12:37:46.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:46 np0005546954 nova_compute[187160]: 2025-12-05 12:37:46.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:37:46 np0005546954 nova_compute[187160]: 2025-12-05 12:37:46.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:37:46 np0005546954 nova_compute[187160]: 2025-12-05 12:37:46.069 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:37:46 np0005546954 nova_compute[187160]: 2025-12-05 12:37:46.069 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:46 np0005546954 nova_compute[187160]: 2025-12-05 12:37:46.070 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.084 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.084 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.084 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.085 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.282 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.284 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6147MB free_disk=73.3745002746582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.284 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.284 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.360 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.361 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.390 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.408 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.410 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:37:47 np0005546954 nova_compute[187160]: 2025-12-05 12:37:47.411 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:37:48 np0005546954 nova_compute[187160]: 2025-12-05 12:37:48.406 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:48 np0005546954 nova_compute[187160]: 2025-12-05 12:37:48.407 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:48 np0005546954 nova_compute[187160]: 2025-12-05 12:37:48.408 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:37:49 np0005546954 nova_compute[187160]: 2025-12-05 12:37:49.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:50 np0005546954 nova_compute[187160]: 2025-12-05 12:37:50.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:54 np0005546954 podman[207484]: 2025-12-05 12:37:54.548379042 +0000 UTC m=+0.052980552 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 07:37:58 np0005546954 podman[207503]: 2025-12-05 12:37:58.564524279 +0000 UTC m=+0.066276037 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:38:00 np0005546954 podman[207527]: 2025-12-05 12:38:00.592276762 +0000 UTC m=+0.109221526 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Dec  5 07:38:13 np0005546954 podman[207555]: 2025-12-05 12:38:13.135746572 +0000 UTC m=+0.067334712 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Dec  5 07:38:13 np0005546954 podman[207556]: 2025-12-05 12:38:13.154631141 +0000 UTC m=+0.082213154 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:38:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:38:16.930 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:38:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:38:16.931 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:38:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:38:16.931 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:38:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:38:17.728 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:38:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:38:17.729 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:38:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:38:17.732 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:38:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:38:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:38:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:38:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:38:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:38:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:38:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:38:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:38:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:38:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:38:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:38:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:38:25 np0005546954 podman[207596]: 2025-12-05 12:38:25.568162662 +0000 UTC m=+0.079411875 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec  5 07:38:29 np0005546954 podman[207615]: 2025-12-05 12:38:29.582353618 +0000 UTC m=+0.082826424 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:38:31 np0005546954 podman[207639]: 2025-12-05 12:38:31.60476107 +0000 UTC m=+0.107380250 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:38:35 np0005546954 podman[197513]: time="2025-12-05T12:38:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:38:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:38:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:38:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:38:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2564 "" "Go-http-client/1.1"
Dec  5 07:38:43 np0005546954 podman[207666]: 2025-12-05 12:38:43.565864017 +0000 UTC m=+0.068345925 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:38:43 np0005546954 podman[207665]: 2025-12-05 12:38:43.576185004 +0000 UTC m=+0.088123700 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal)
Dec  5 07:38:46 np0005546954 nova_compute[187160]: 2025-12-05 12:38:46.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.037 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.311 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.313 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.313 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.449 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.450 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.450 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.450 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.601 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.602 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6162MB free_disk=73.37448120117188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.602 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.603 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.771 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.772 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:38:47 np0005546954 nova_compute[187160]: 2025-12-05 12:38:47.807 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:38:48 np0005546954 nova_compute[187160]: 2025-12-05 12:38:48.040 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:38:48 np0005546954 nova_compute[187160]: 2025-12-05 12:38:48.043 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:38:48 np0005546954 nova_compute[187160]: 2025-12-05 12:38:48.044 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:38:48 np0005546954 nova_compute[187160]: 2025-12-05 12:38:48.770 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:48 np0005546954 nova_compute[187160]: 2025-12-05 12:38:48.771 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:38:48 np0005546954 nova_compute[187160]: 2025-12-05 12:38:48.772 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:38:49 np0005546954 nova_compute[187160]: 2025-12-05 12:38:49.340 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:38:49 np0005546954 nova_compute[187160]: 2025-12-05 12:38:49.342 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:49 np0005546954 nova_compute[187160]: 2025-12-05 12:38:49.343 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:38:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:38:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:38:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:38:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:38:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:38:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:38:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:38:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:38:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:38:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:38:50 np0005546954 nova_compute[187160]: 2025-12-05 12:38:50.043 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:50 np0005546954 nova_compute[187160]: 2025-12-05 12:38:50.047 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:50 np0005546954 nova_compute[187160]: 2025-12-05 12:38:50.047 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:56 np0005546954 podman[207707]: 2025-12-05 12:38:56.544435365 +0000 UTC m=+0.056940324 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:39:00 np0005546954 podman[207724]: 2025-12-05 12:39:00.540532747 +0000 UTC m=+0.050504800 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:39:02 np0005546954 podman[207748]: 2025-12-05 12:39:02.594962123 +0000 UTC m=+0.101971419 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:39:05 np0005546954 podman[197513]: time="2025-12-05T12:39:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:39:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:39:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:39:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:39:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2568 "" "Go-http-client/1.1"
Dec  5 07:39:14 np0005546954 podman[207775]: 2025-12-05 12:39:14.535657369 +0000 UTC m=+0.050896167 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 
'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  5 07:39:14 np0005546954 podman[207776]: 2025-12-05 12:39:14.549287187 +0000 UTC m=+0.059568799 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Dec  5 07:39:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:39:16.932 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:39:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:39:16.934 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:39:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:39:16.935 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:39:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:39:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:39:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:39:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:39:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:39:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:39:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:39:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:39:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:39:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:39:27 np0005546954 podman[207814]: 2025-12-05 12:39:27.569130004 +0000 UTC m=+0.079775432 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  5 07:39:31 np0005546954 podman[207837]: 2025-12-05 12:39:31.550493232 +0000 UTC m=+0.055848472 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:39:33 np0005546954 podman[207863]: 2025-12-05 12:39:33.594701991 +0000 UTC m=+0.094576337 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:39:35 np0005546954 podman[197513]: time="2025-12-05T12:39:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:39:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:39:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:39:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:39:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2558 "" "Go-http-client/1.1"
Dec  5 07:39:45 np0005546954 podman[207891]: 2025-12-05 12:39:45.559344077 +0000 UTC m=+0.062102079 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 07:39:45 np0005546954 podman[207890]: 2025-12-05 12:39:45.559387798 +0000 UTC m=+0.065162134 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec  5 07:39:46 np0005546954 nova_compute[187160]: 2025-12-05 12:39:46.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:39:46 np0005546954 nova_compute[187160]: 2025-12-05 12:39:46.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:39:46 np0005546954 nova_compute[187160]: 2025-12-05 12:39:46.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 07:39:46 np0005546954 nova_compute[187160]: 2025-12-05 12:39:46.928 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 07:39:46 np0005546954 nova_compute[187160]: 2025-12-05 12:39:46.929 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:39:46 np0005546954 nova_compute[187160]: 2025-12-05 12:39:46.929 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 07:39:47 np0005546954 nova_compute[187160]: 2025-12-05 12:39:47.044 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.096 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.097 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.097 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.113 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.113 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.114 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.146 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.147 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.147 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.147 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.294 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.296 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6167MB free_disk=73.37448120117188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.296 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:39:48 np0005546954 nova_compute[187160]: 2025-12-05 12:39:48.296 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:39:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:39:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:39:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:39:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:39:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:39:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:39:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:39:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:39:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:39:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:39:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:39:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:39:49 np0005546954 nova_compute[187160]: 2025-12-05 12:39:49.762 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:39:49 np0005546954 nova_compute[187160]: 2025-12-05 12:39:49.763 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:39:49 np0005546954 nova_compute[187160]: 2025-12-05 12:39:49.851 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing inventories for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 07:39:49 np0005546954 nova_compute[187160]: 2025-12-05 12:39:49.910 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating ProviderTree inventory for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 07:39:49 np0005546954 nova_compute[187160]: 2025-12-05 12:39:49.911 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:39:49 np0005546954 nova_compute[187160]: 2025-12-05 12:39:49.931 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing aggregate associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 07:39:49 np0005546954 nova_compute[187160]: 2025-12-05 12:39:49.976 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing trait associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 07:39:50 np0005546954 nova_compute[187160]: 2025-12-05 12:39:50.000 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:39:50 np0005546954 nova_compute[187160]: 2025-12-05 12:39:50.637 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:39:50 np0005546954 nova_compute[187160]: 2025-12-05 12:39:50.638 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:39:50 np0005546954 nova_compute[187160]: 2025-12-05 12:39:50.638 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:39:51 np0005546954 nova_compute[187160]: 2025-12-05 12:39:51.564 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:39:51 np0005546954 nova_compute[187160]: 2025-12-05 12:39:51.565 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:39:51 np0005546954 nova_compute[187160]: 2025-12-05 12:39:51.565 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:39:51 np0005546954 nova_compute[187160]: 2025-12-05 12:39:51.565 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:39:51 np0005546954 nova_compute[187160]: 2025-12-05 12:39:51.566 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:39:52 np0005546954 nova_compute[187160]: 2025-12-05 12:39:52.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:39:58 np0005546954 podman[207929]: 2025-12-05 12:39:58.544774156 +0000 UTC m=+0.055767699 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  5 07:40:02 np0005546954 podman[207949]: 2025-12-05 12:40:02.58439312 +0000 UTC m=+0.085793141 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:40:04 np0005546954 podman[207973]: 2025-12-05 12:40:04.599450443 +0000 UTC m=+0.104381295 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 07:40:05 np0005546954 podman[197513]: time="2025-12-05T12:40:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:40:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:40:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:40:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:40:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2560 "" "Go-http-client/1.1"
Dec  5 07:40:16 np0005546954 podman[208000]: 2025-12-05 12:40:16.555374282 +0000 UTC m=+0.067350853 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  5 07:40:16 np0005546954 podman[208001]: 2025-12-05 12:40:16.561296838 +0000 UTC m=+0.068209529 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:40:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:40:16.934 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:40:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:40:16.936 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:40:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:40:16.936 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:40:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:40:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:40:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:40:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:40:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:40:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:40:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:40:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:40:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:40:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:40:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:40:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:40:29 np0005546954 podman[208041]: 2025-12-05 12:40:29.551641431 +0000 UTC m=+0.064573006 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:40:33 np0005546954 podman[208061]: 2025-12-05 12:40:33.540993146 +0000 UTC m=+0.052181947 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:40:35 np0005546954 podman[208086]: 2025-12-05 12:40:35.572219625 +0000 UTC m=+0.088664501 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec  5 07:40:35 np0005546954 podman[197513]: time="2025-12-05T12:40:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:40:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:40:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:40:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:40:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2566 "" "Go-http-client/1.1"
Dec  5 07:40:47 np0005546954 podman[208114]: 2025-12-05 12:40:47.545626408 +0000 UTC m=+0.058807135 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, 
com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, vcs-type=git, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal)
Dec  5 07:40:47 np0005546954 podman[208115]: 2025-12-05 12:40:47.546931748 +0000 UTC m=+0.058348370 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec  5 07:40:48 np0005546954 nova_compute[187160]: 2025-12-05 12:40:48.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.053 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.053 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.053 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.109 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.109 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.109 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.109 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.300 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.301 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.301 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.301 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:40:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:40:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:40:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:40:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:40:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:40:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:40:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:40:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:40:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:40:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:40:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:40:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.458 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.459 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6182MB free_disk=73.3745002746582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.459 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.459 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.514 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.514 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.533 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.549 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.550 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:40:49 np0005546954 nova_compute[187160]: 2025-12-05 12:40:49.550 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:40:52 np0005546954 nova_compute[187160]: 2025-12-05 12:40:52.480 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:40:52 np0005546954 nova_compute[187160]: 2025-12-05 12:40:52.481 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:40:52 np0005546954 nova_compute[187160]: 2025-12-05 12:40:52.481 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:40:53 np0005546954 nova_compute[187160]: 2025-12-05 12:40:53.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:40:53 np0005546954 nova_compute[187160]: 2025-12-05 12:40:53.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:41:00 np0005546954 podman[208154]: 2025-12-05 12:41:00.541023469 +0000 UTC m=+0.049737661 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:41:04 np0005546954 podman[208173]: 2025-12-05 12:41:04.540056678 +0000 UTC m=+0.049974279 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:41:05 np0005546954 podman[197513]: time="2025-12-05T12:41:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:41:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:41:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:41:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:41:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2566 "" "Go-http-client/1.1"
Dec  5 07:41:06 np0005546954 podman[208198]: 2025-12-05 12:41:06.579880626 +0000 UTC m=+0.091127768 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  5 07:41:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:41:16.936 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:41:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:41:16.936 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:41:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:41:16.937 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:41:18 np0005546954 podman[208225]: 2025-12-05 12:41:18.549356796 +0000 UTC m=+0.060401833 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Dec  5 07:41:18 np0005546954 podman[208226]: 2025-12-05 12:41:18.561074105 +0000 UTC m=+0.067203768 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:41:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:41:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:41:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:41:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:41:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:41:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:41:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:41:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:41:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:41:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:41:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:41:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:41:31 np0005546954 podman[208268]: 2025-12-05 12:41:31.564078247 +0000 UTC m=+0.064758820 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:41:35 np0005546954 podman[208288]: 2025-12-05 12:41:35.549375639 +0000 UTC m=+0.063177261 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:41:35 np0005546954 podman[197513]: time="2025-12-05T12:41:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:41:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:41:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:41:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:41:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Dec  5 07:41:37 np0005546954 podman[208313]: 2025-12-05 12:41:37.615350598 +0000 UTC m=+0.123420208 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible)
Dec  5 07:41:49 np0005546954 nova_compute[187160]: 2025-12-05 12:41:49.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:41:49 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:41:49.081 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:41:49 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:41:49.083 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:41:49 np0005546954 podman[208340]: 2025-12-05 12:41:49.134287639 +0000 UTC m=+0.060236218 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Dec  5 07:41:49 np0005546954 podman[208341]: 2025-12-05 12:41:49.145206743 +0000 UTC m=+0.063588944 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=multipathd)
Dec  5 07:41:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:41:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:41:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:41:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:41:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:41:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:41:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:41:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:41:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:41:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:41:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:41:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:41:50 np0005546954 nova_compute[187160]: 2025-12-05 12:41:50.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:41:50 np0005546954 nova_compute[187160]: 2025-12-05 12:41:50.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:41:50 np0005546954 nova_compute[187160]: 2025-12-05 12:41:50.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:41:50 np0005546954 nova_compute[187160]: 2025-12-05 12:41:50.056 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:41:50 np0005546954 nova_compute[187160]: 2025-12-05 12:41:50.056 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:41:50 np0005546954 nova_compute[187160]: 2025-12-05 12:41:50.057 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.068 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.068 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.068 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.069 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.205 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.206 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6178MB free_disk=73.37451553344727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.206 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.206 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.273 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.274 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.308 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.322 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.323 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:41:51 np0005546954 nova_compute[187160]: 2025-12-05 12:41:51.324 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:41:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:41:52.087 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:41:52 np0005546954 nova_compute[187160]: 2025-12-05 12:41:52.324 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:41:52 np0005546954 nova_compute[187160]: 2025-12-05 12:41:52.325 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:41:53 np0005546954 nova_compute[187160]: 2025-12-05 12:41:53.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:41:54 np0005546954 nova_compute[187160]: 2025-12-05 12:41:54.033 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:41:55 np0005546954 nova_compute[187160]: 2025-12-05 12:41:55.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:42:02 np0005546954 podman[208383]: 2025-12-05 12:42:02.558274121 +0000 UTC m=+0.061549879 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec  5 07:42:05 np0005546954 podman[197513]: time="2025-12-05T12:42:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:42:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:42:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:42:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:42:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2562 "" "Go-http-client/1.1"
Dec  5 07:42:06 np0005546954 podman[208403]: 2025-12-05 12:42:06.554096164 +0000 UTC m=+0.061818268 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:42:08 np0005546954 podman[208428]: 2025-12-05 12:42:08.592043521 +0000 UTC m=+0.105069260 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:42:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:16.938 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:42:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:16.939 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:42:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:16.940 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:42:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:42:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:42:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:42:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:42:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:42:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:42:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:42:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:42:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:42:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:42:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:42:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:42:19 np0005546954 podman[208455]: 2025-12-05 12:42:19.544173512 +0000 UTC m=+0.054847929 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  5 07:42:19 np0005546954 podman[208454]: 2025-12-05 12:42:19.553057841 +0000 UTC m=+0.067045442 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350)
Dec  5 07:42:33 np0005546954 podman[208496]: 2025-12-05 12:42:33.553290423 +0000 UTC m=+0.059482151 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:42:35 np0005546954 podman[197513]: time="2025-12-05T12:42:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:42:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:42:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:42:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:42:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Dec  5 07:42:37 np0005546954 podman[208513]: 2025-12-05 12:42:37.558266107 +0000 UTC m=+0.063108313 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:42:39 np0005546954 podman[208537]: 2025-12-05 12:42:39.575446989 +0000 UTC m=+0.091530001 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 07:42:49 np0005546954 nova_compute[187160]: 2025-12-05 12:42:49.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:42:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:42:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:42:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:42:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:42:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:42:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:42:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:42:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:42:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:42:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:42:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:42:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:42:50 np0005546954 nova_compute[187160]: 2025-12-05 12:42:50.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:42:50 np0005546954 nova_compute[187160]: 2025-12-05 12:42:50.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:42:50 np0005546954 nova_compute[187160]: 2025-12-05 12:42:50.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:42:50 np0005546954 podman[208565]: 2025-12-05 12:42:50.547165639 +0000 UTC m=+0.051834909 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:42:50 np0005546954 podman[208564]: 2025-12-05 12:42:50.550898826 +0000 UTC m=+0.057809826 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec  5 07:42:51 np0005546954 nova_compute[187160]: 2025-12-05 12:42:51.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:42:51 np0005546954 nova_compute[187160]: 2025-12-05 12:42:51.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:42:51 np0005546954 nova_compute[187160]: 2025-12-05 12:42:51.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:42:51 np0005546954 nova_compute[187160]: 2025-12-05 12:42:51.066 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:42:52 np0005546954 nova_compute[187160]: 2025-12-05 12:42:52.475 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:42:52 np0005546954 nova_compute[187160]: 2025-12-05 12:42:52.476 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:42:52 np0005546954 nova_compute[187160]: 2025-12-05 12:42:52.494 187164 DEBUG nova.compute.manager [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:42:52 np0005546954 nova_compute[187160]: 2025-12-05 12:42:52.568 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:42:52 np0005546954 nova_compute[187160]: 2025-12-05 12:42:52.568 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:42:52 np0005546954 nova_compute[187160]: 2025-12-05 12:42:52.574 187164 DEBUG nova.virt.hardware [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:42:52 np0005546954 nova_compute[187160]: 2025-12-05 12:42:52.574 187164 INFO nova.compute.claims [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.077 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.129 187164 DEBUG nova.compute.provider_tree [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.146 187164 DEBUG nova.scheduler.client.report [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.183 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.184 187164 DEBUG nova.compute.manager [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.188 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.188 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.188 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.241 187164 DEBUG nova.compute.manager [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.242 187164 DEBUG nova.network.neutron [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.264 187164 INFO nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.286 187164 DEBUG nova.compute.manager [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.371 187164 DEBUG nova.compute.manager [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.373 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.374 187164 INFO nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Creating image(s)#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.375 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "/var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.375 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "/var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.376 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "/var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.376 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.377 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.411 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.412 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6193MB free_disk=73.37458419799805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.412 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.412 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.481 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.481 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.481 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.532 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.550 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.597 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:42:53 np0005546954 nova_compute[187160]: 2025-12-05 12:42:53.598 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:42:54 np0005546954 nova_compute[187160]: 2025-12-05 12:42:54.216 187164 WARNING oslo_policy.policy [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec  5 07:42:54 np0005546954 nova_compute[187160]: 2025-12-05 12:42:54.216 187164 WARNING oslo_policy.policy [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec  5 07:42:54 np0005546954 nova_compute[187160]: 2025-12-05 12:42:54.218 187164 DEBUG nova.policy [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7ce7ef64754e4a32b4af3272e31a4a5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:42:54 np0005546954 nova_compute[187160]: 2025-12-05 12:42:54.604 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:42:54 np0005546954 nova_compute[187160]: 2025-12-05 12:42:54.605 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.090 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.155 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4.part --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.157 187164 DEBUG nova.virt.images [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] f4c3125a-6fd0-40bb-aa00-a7e736ee853d was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.158 187164 DEBUG nova.privsep.utils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.159 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4.part /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.254 187164 DEBUG nova.network.neutron [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Successfully created port: f9c0965a-861e-4c24-9c97-679c6d706267 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.342 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4.part /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4.converted" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.352 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.414 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4.converted --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.416 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:42:55 np0005546954 nova_compute[187160]: 2025-12-05 12:42:55.429 187164 INFO oslo.privsep.daemon [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp0d4xw7wy/privsep.sock']#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.174 187164 INFO oslo.privsep.daemon [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.009 208625 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.013 208625 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.015 208625 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.016 208625 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208625#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.212 187164 DEBUG nova.network.neutron [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Successfully updated port: f9c0965a-861e-4c24-9c97-679c6d706267 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.230 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "refresh_cache-2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.231 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquired lock "refresh_cache-2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.231 187164 DEBUG nova.network.neutron [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.271 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.329 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.330 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.331 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.344 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.364 187164 DEBUG nova.network.neutron [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.406 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.407 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.448 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.449 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.450 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.509 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.511 187164 DEBUG nova.virt.disk.api [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Checking if we can resize image /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.511 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.574 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.575 187164 DEBUG nova.virt.disk.api [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Cannot resize image /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.577 187164 DEBUG nova.objects.instance [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lazy-loading 'migration_context' on Instance uuid 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.638 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.639 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Ensure instance console log exists: /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.640 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.641 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.642 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.673 187164 DEBUG nova.compute.manager [req-29a70d00-72e3-404c-b882-4a1bbb033f55 req-feb1d091-ad14-408b-ac3b-7d4dd859ee4a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Received event network-changed-f9c0965a-861e-4c24-9c97-679c6d706267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.673 187164 DEBUG nova.compute.manager [req-29a70d00-72e3-404c-b882-4a1bbb033f55 req-feb1d091-ad14-408b-ac3b-7d4dd859ee4a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Refreshing instance network info cache due to event network-changed-f9c0965a-861e-4c24-9c97-679c6d706267. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:42:56 np0005546954 nova_compute[187160]: 2025-12-05 12:42:56.674 187164 DEBUG oslo_concurrency.lockutils [req-29a70d00-72e3-404c-b882-4a1bbb033f55 req-feb1d091-ad14-408b-ac3b-7d4dd859ee4a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.122 187164 DEBUG nova.network.neutron [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Updating instance_info_cache with network_info: [{"id": "f9c0965a-861e-4c24-9c97-679c6d706267", "address": "fa:16:3e:99:40:2c", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9c0965a-86", "ovs_interfaceid": "f9c0965a-861e-4c24-9c97-679c6d706267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.141 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Releasing lock "refresh_cache-2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.142 187164 DEBUG nova.compute.manager [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Instance network_info: |[{"id": "f9c0965a-861e-4c24-9c97-679c6d706267", "address": "fa:16:3e:99:40:2c", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9c0965a-86", "ovs_interfaceid": "f9c0965a-861e-4c24-9c97-679c6d706267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.143 187164 DEBUG oslo_concurrency.lockutils [req-29a70d00-72e3-404c-b882-4a1bbb033f55 req-feb1d091-ad14-408b-ac3b-7d4dd859ee4a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.143 187164 DEBUG nova.network.neutron [req-29a70d00-72e3-404c-b882-4a1bbb033f55 req-feb1d091-ad14-408b-ac3b-7d4dd859ee4a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Refreshing network info cache for port f9c0965a-861e-4c24-9c97-679c6d706267 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.149 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Start _get_guest_xml network_info=[{"id": "f9c0965a-861e-4c24-9c97-679c6d706267", "address": "fa:16:3e:99:40:2c", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9c0965a-86", "ovs_interfaceid": "f9c0965a-861e-4c24-9c97-679c6d706267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.157 187164 WARNING nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.169 187164 DEBUG nova.virt.libvirt.host [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.170 187164 DEBUG nova.virt.libvirt.host [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.176 187164 DEBUG nova.virt.libvirt.host [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.177 187164 DEBUG nova.virt.libvirt.host [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.180 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.181 187164 DEBUG nova.virt.hardware [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.182 187164 DEBUG nova.virt.hardware [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.182 187164 DEBUG nova.virt.hardware [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.183 187164 DEBUG nova.virt.hardware [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.183 187164 DEBUG nova.virt.hardware [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.184 187164 DEBUG nova.virt.hardware [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.184 187164 DEBUG nova.virt.hardware [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.185 187164 DEBUG nova.virt.hardware [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.185 187164 DEBUG nova.virt.hardware [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.186 187164 DEBUG nova.virt.hardware [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.186 187164 DEBUG nova.virt.hardware [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.194 187164 DEBUG nova.privsep.utils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.196 187164 DEBUG nova.virt.libvirt.vif [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:42:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-329469785',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-329469785',id=2,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-trugiq9o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:42:53Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=2bf13a3e-bb2a-45f0-893e-0eb33fedb85e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f9c0965a-861e-4c24-9c97-679c6d706267", "address": "fa:16:3e:99:40:2c", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9c0965a-86", "ovs_interfaceid": "f9c0965a-861e-4c24-9c97-679c6d706267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.197 187164 DEBUG nova.network.os_vif_util [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converting VIF {"id": "f9c0965a-861e-4c24-9c97-679c6d706267", "address": "fa:16:3e:99:40:2c", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9c0965a-86", "ovs_interfaceid": "f9c0965a-861e-4c24-9c97-679c6d706267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.198 187164 DEBUG nova.network.os_vif_util [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:40:2c,bridge_name='br-int',has_traffic_filtering=True,id=f9c0965a-861e-4c24-9c97-679c6d706267,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9c0965a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.202 187164 DEBUG nova.objects.instance [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lazy-loading 'pci_devices' on Instance uuid 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.219 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  <uuid>2bf13a3e-bb2a-45f0-893e-0eb33fedb85e</uuid>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  <name>instance-00000002</name>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteActionsViaActuator-server-329469785</nova:name>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 12:42:57</nova:creationTime>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:        <nova:user uuid="7ce7ef64754e4a32b4af3272e31a4a5e">tempest-TestExecuteActionsViaActuator-1570363089-project-member</nova:user>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:        <nova:project uuid="6b5f383ed0484ca1bde081bf623dad4b">tempest-TestExecuteActionsViaActuator-1570363089</nova:project>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:        <nova:port uuid="f9c0965a-861e-4c24-9c97-679c6d706267">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <system>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <entry name="serial">2bf13a3e-bb2a-45f0-893e-0eb33fedb85e</entry>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <entry name="uuid">2bf13a3e-bb2a-45f0-893e-0eb33fedb85e</entry>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    </system>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  <os>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  </clock>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk.config"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:99:40:2c"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <target dev="tapf9c0965a-86"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/console.log" append="off"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    </serial>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <video>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 07:42:57 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 07:42:57 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:42:57 np0005546954 nova_compute[187160]: </domain>
Dec  5 07:42:57 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.221 187164 DEBUG nova.compute.manager [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Preparing to wait for external event network-vif-plugged-f9c0965a-861e-4c24-9c97-679c6d706267 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.221 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.222 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.222 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.223 187164 DEBUG nova.virt.libvirt.vif [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:42:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-329469785',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-329469785',id=2,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-trugiq9o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:42:53Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=2bf13a3e-bb2a-45f0-893e-0eb33fedb85e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f9c0965a-861e-4c24-9c97-679c6d706267", "address": "fa:16:3e:99:40:2c", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9c0965a-86", "ovs_interfaceid": "f9c0965a-861e-4c24-9c97-679c6d706267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.224 187164 DEBUG nova.network.os_vif_util [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converting VIF {"id": "f9c0965a-861e-4c24-9c97-679c6d706267", "address": "fa:16:3e:99:40:2c", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9c0965a-86", "ovs_interfaceid": "f9c0965a-861e-4c24-9c97-679c6d706267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.225 187164 DEBUG nova.network.os_vif_util [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:40:2c,bridge_name='br-int',has_traffic_filtering=True,id=f9c0965a-861e-4c24-9c97-679c6d706267,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9c0965a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.226 187164 DEBUG os_vif [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:40:2c,bridge_name='br-int',has_traffic_filtering=True,id=f9c0965a-861e-4c24-9c97-679c6d706267,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9c0965a-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.282 187164 DEBUG ovsdbapp.backend.ovs_idl [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.283 187164 DEBUG ovsdbapp.backend.ovs_idl [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.283 187164 DEBUG ovsdbapp.backend.ovs_idl [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.284 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.284 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.284 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.285 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.286 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.289 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.305 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.305 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.306 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:42:57 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.309 187164 INFO oslo.privsep.daemon [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp29_ej2o6/privsep.sock']#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.052 187164 INFO oslo.privsep.daemon [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.879 208646 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.887 208646 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.892 208646 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:57.892 208646 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208646#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.111 187164 DEBUG nova.network.neutron [req-29a70d00-72e3-404c-b882-4a1bbb033f55 req-feb1d091-ad14-408b-ac3b-7d4dd859ee4a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Updated VIF entry in instance network info cache for port f9c0965a-861e-4c24-9c97-679c6d706267. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.112 187164 DEBUG nova.network.neutron [req-29a70d00-72e3-404c-b882-4a1bbb033f55 req-feb1d091-ad14-408b-ac3b-7d4dd859ee4a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Updating instance_info_cache with network_info: [{"id": "f9c0965a-861e-4c24-9c97-679c6d706267", "address": "fa:16:3e:99:40:2c", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9c0965a-86", "ovs_interfaceid": "f9c0965a-861e-4c24-9c97-679c6d706267", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.127 187164 DEBUG oslo_concurrency.lockutils [req-29a70d00-72e3-404c-b882-4a1bbb033f55 req-feb1d091-ad14-408b-ac3b-7d4dd859ee4a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.398 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.398 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9c0965a-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.399 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf9c0965a-86, col_values=(('external_ids', {'iface-id': 'f9c0965a-861e-4c24-9c97-679c6d706267', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:99:40:2c', 'vm-uuid': '2bf13a3e-bb2a-45f0-893e-0eb33fedb85e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.401 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:42:58 np0005546954 NetworkManager[55665]: <info>  [1764938578.4026] manager: (tapf9c0965a-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.404 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.414 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.416 187164 INFO os_vif [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:40:2c,bridge_name='br-int',has_traffic_filtering=True,id=f9c0965a-861e-4c24-9c97-679c6d706267,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9c0965a-86')#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.479 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.480 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.480 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] No VIF found with MAC fa:16:3e:99:40:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.481 187164 INFO nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Using config drive#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.767 187164 INFO nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Creating config drive at /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk.config#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.778 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1oa_blqc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.864 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:42:58 np0005546954 nova_compute[187160]: 2025-12-05 12:42:58.915 187164 DEBUG oslo_concurrency.processutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1oa_blqc" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:42:59 np0005546954 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec  5 07:42:59 np0005546954 NetworkManager[55665]: <info>  [1764938579.0260] manager: (tapf9c0965a-86): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Dec  5 07:42:59 np0005546954 kernel: tapf9c0965a-86: entered promiscuous mode
Dec  5 07:42:59 np0005546954 ovn_controller[95566]: 2025-12-05T12:42:59Z|00027|binding|INFO|Claiming lport f9c0965a-861e-4c24-9c97-679c6d706267 for this chassis.
Dec  5 07:42:59 np0005546954 ovn_controller[95566]: 2025-12-05T12:42:59Z|00028|binding|INFO|f9c0965a-861e-4c24-9c97-679c6d706267: Claiming fa:16:3e:99:40:2c 10.100.0.12
Dec  5 07:42:59 np0005546954 nova_compute[187160]: 2025-12-05 12:42:59.028 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:42:59 np0005546954 nova_compute[187160]: 2025-12-05 12:42:59.037 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:42:59 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:59.048 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:40:2c 10.100.0.12'], port_security=['fa:16:3e:99:40:2c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2bf13a3e-bb2a-45f0-893e-0eb33fedb85e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee43e901-b158-4dc0-894f-2384aef8b277', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a865bea6-413e-4ecb-bace-2ec9005935f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c84cb49-52df-48a9-8d24-aff5b642e12a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=f9c0965a-861e-4c24-9c97-679c6d706267) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:42:59 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:59.052 104428 INFO neutron.agent.ovn.metadata.agent [-] Port f9c0965a-861e-4c24-9c97-679c6d706267 in datapath ee43e901-b158-4dc0-894f-2384aef8b277 bound to our chassis#033[00m
Dec  5 07:42:59 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:59.058 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee43e901-b158-4dc0-894f-2384aef8b277#033[00m
Dec  5 07:42:59 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:59.061 104428 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpqv5l3nlp/privsep.sock']#033[00m
Dec  5 07:42:59 np0005546954 systemd-udevd[208670]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:42:59 np0005546954 NetworkManager[55665]: <info>  [1764938579.0933] device (tapf9c0965a-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:42:59 np0005546954 NetworkManager[55665]: <info>  [1764938579.0947] device (tapf9c0965a-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:42:59 np0005546954 systemd-machined[153497]: New machine qemu-1-instance-00000002.
Dec  5 07:42:59 np0005546954 ovn_controller[95566]: 2025-12-05T12:42:59Z|00029|binding|INFO|Setting lport f9c0965a-861e-4c24-9c97-679c6d706267 ovn-installed in OVS
Dec  5 07:42:59 np0005546954 ovn_controller[95566]: 2025-12-05T12:42:59Z|00030|binding|INFO|Setting lport f9c0965a-861e-4c24-9c97-679c6d706267 up in Southbound
Dec  5 07:42:59 np0005546954 nova_compute[187160]: 2025-12-05 12:42:59.169 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:42:59 np0005546954 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Dec  5 07:42:59 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:59.771 104428 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  5 07:42:59 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:59.774 104428 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpqv5l3nlp/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  5 07:42:59 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:59.624 208690 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  5 07:42:59 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:59.634 208690 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  5 07:42:59 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:59.639 208690 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Dec  5 07:42:59 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:59.639 208690 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208690#033[00m
Dec  5 07:42:59 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:42:59.779 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b752989f-f752-48b7-9654-befd41b563f1]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:42:59 np0005546954 nova_compute[187160]: 2025-12-05 12:42:59.838 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938579.8364608, 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:42:59 np0005546954 nova_compute[187160]: 2025-12-05 12:42:59.839 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] VM Started (Lifecycle Event)#033[00m
Dec  5 07:42:59 np0005546954 nova_compute[187160]: 2025-12-05 12:42:59.869 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:42:59 np0005546954 nova_compute[187160]: 2025-12-05 12:42:59.876 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938579.8368797, 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:42:59 np0005546954 nova_compute[187160]: 2025-12-05 12:42:59.876 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:42:59 np0005546954 nova_compute[187160]: 2025-12-05 12:42:59.898 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:42:59 np0005546954 nova_compute[187160]: 2025-12-05 12:42:59.903 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:42:59 np0005546954 nova_compute[187160]: 2025-12-05 12:42:59.924 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.222 187164 DEBUG nova.compute.manager [req-9e19022e-c23e-48e7-a403-bec31120dc2a req-bb153aa4-23ec-4cc3-a662-d439084d9e79 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Received event network-vif-plugged-f9c0965a-861e-4c24-9c97-679c6d706267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.222 187164 DEBUG oslo_concurrency.lockutils [req-9e19022e-c23e-48e7-a403-bec31120dc2a req-bb153aa4-23ec-4cc3-a662-d439084d9e79 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.223 187164 DEBUG oslo_concurrency.lockutils [req-9e19022e-c23e-48e7-a403-bec31120dc2a req-bb153aa4-23ec-4cc3-a662-d439084d9e79 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.223 187164 DEBUG oslo_concurrency.lockutils [req-9e19022e-c23e-48e7-a403-bec31120dc2a req-bb153aa4-23ec-4cc3-a662-d439084d9e79 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.223 187164 DEBUG nova.compute.manager [req-9e19022e-c23e-48e7-a403-bec31120dc2a req-bb153aa4-23ec-4cc3-a662-d439084d9e79 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Processing event network-vif-plugged-f9c0965a-861e-4c24-9c97-679c6d706267 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.224 187164 DEBUG nova.compute.manager [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.227 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938580.22751, 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.227 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.229 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.244 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.248 187164 INFO nova.virt.libvirt.driver [-] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Instance spawned successfully.#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.249 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.250 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.270 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.274 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.274 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.275 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.275 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.275 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.276 187164 DEBUG nova.virt.libvirt.driver [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:43:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:00.337 208690 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:43:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:00.337 208690 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:43:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:00.337 208690 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.341 187164 INFO nova.compute.manager [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Took 6.97 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.342 187164 DEBUG nova.compute.manager [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.402 187164 INFO nova.compute.manager [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Took 7.86 seconds to build instance.#033[00m
Dec  5 07:43:00 np0005546954 nova_compute[187160]: 2025-12-05 12:43:00.420 187164 DEBUG oslo_concurrency.lockutils [None req-17354a02-0883-4025-9b71-7e0ed1ff8973 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.944s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:43:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:00.904 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb4b76a-470e-43c4-8068-27ca3e790c15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:00.906 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee43e901-b1 in ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:43:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:00.909 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee43e901-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:43:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:00.909 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5446eb43-f0cd-4bbe-8ddd-d211852d7bed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:00.912 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ff059607-5162-4051-a059-ab3171feb1aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:00.956 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[c420adc4-ee28-47fa-9e32-df76709eb828]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:00.990 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[33a9fc2d-6c98-44f1-8b9e-db0c26cb7980]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:00.993 104428 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp3t5kp7hm/privsep.sock']#033[00m
Dec  5 07:43:01 np0005546954 nova_compute[187160]: 2025-12-05 12:43:01.050 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:43:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:01.052 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:43:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:01.665 104428 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  5 07:43:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:01.665 104428 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp3t5kp7hm/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  5 07:43:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:01.536 208711 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  5 07:43:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:01.543 208711 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  5 07:43:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:01.545 208711 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec  5 07:43:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:01.545 208711 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208711#033[00m
Dec  5 07:43:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:01.668 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[682b18c9-5e04-4186-9865-b9d761c20428]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:02 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:02.129 208711 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:43:02 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:02.129 208711 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:43:02 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:02.129 208711 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:43:02 np0005546954 nova_compute[187160]: 2025-12-05 12:43:02.310 187164 DEBUG nova.compute.manager [req-d5e56735-c0a9-4802-8781-52d8e6d88f33 req-135cd3f5-8d15-417d-bbf5-ee97fc13f3d9 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Received event network-vif-plugged-f9c0965a-861e-4c24-9c97-679c6d706267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:43:02 np0005546954 nova_compute[187160]: 2025-12-05 12:43:02.311 187164 DEBUG oslo_concurrency.lockutils [req-d5e56735-c0a9-4802-8781-52d8e6d88f33 req-135cd3f5-8d15-417d-bbf5-ee97fc13f3d9 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:43:02 np0005546954 nova_compute[187160]: 2025-12-05 12:43:02.311 187164 DEBUG oslo_concurrency.lockutils [req-d5e56735-c0a9-4802-8781-52d8e6d88f33 req-135cd3f5-8d15-417d-bbf5-ee97fc13f3d9 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:43:02 np0005546954 nova_compute[187160]: 2025-12-05 12:43:02.311 187164 DEBUG oslo_concurrency.lockutils [req-d5e56735-c0a9-4802-8781-52d8e6d88f33 req-135cd3f5-8d15-417d-bbf5-ee97fc13f3d9 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:43:02 np0005546954 nova_compute[187160]: 2025-12-05 12:43:02.311 187164 DEBUG nova.compute.manager [req-d5e56735-c0a9-4802-8781-52d8e6d88f33 req-135cd3f5-8d15-417d-bbf5-ee97fc13f3d9 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] No waiting events found dispatching network-vif-plugged-f9c0965a-861e-4c24-9c97-679c6d706267 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:43:02 np0005546954 nova_compute[187160]: 2025-12-05 12:43:02.311 187164 WARNING nova.compute.manager [req-d5e56735-c0a9-4802-8781-52d8e6d88f33 req-135cd3f5-8d15-417d-bbf5-ee97fc13f3d9 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Received unexpected event network-vif-plugged-f9c0965a-861e-4c24-9c97-679c6d706267 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:43:02 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:02.711 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[6d83bc71-f484-4616-8bd8-7e8ccffe5353]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:02 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:02.741 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[3e38c3fb-99de-4671-96c9-552ee4cc5097]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:02 np0005546954 NetworkManager[55665]: <info>  [1764938582.7452] manager: (tapee43e901-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Dec  5 07:43:02 np0005546954 systemd-udevd[208723]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:43:02 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:02.785 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[a5333a2f-2c14-412a-a088-7471747c5308]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:02 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:02.788 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[6015d3b9-0791-4699-8ec8-2daf796793d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:02 np0005546954 NetworkManager[55665]: <info>  [1764938582.8304] device (tapee43e901-b0): carrier: link connected
Dec  5 07:43:02 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:02.841 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[cb2767f3-5039-4227-a045-2be4f49daf9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:02 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:02.869 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[754dd397-7202-47a3-b69d-e9ccf130a67e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee43e901-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:af:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358945, 'reachable_time': 33712, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208741, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:02 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:02.898 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[18ff6f7b-83cc-470c-91cc-17036415d438]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe16:af4f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358945, 'tstamp': 358945}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208742, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:02 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:02.928 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[638b7151-ebf1-45a8-9d01-4763574fb5cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee43e901-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:af:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358945, 'reachable_time': 33712, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208743, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:02 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:02.976 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[85576483-7834-406f-bc72-a5a28f584468]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:03.064 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[27a94e6e-c607-49d4-8b1b-09ac9c531636]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:03.067 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee43e901-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:03.068 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:03.069 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee43e901-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:43:03 np0005546954 kernel: tapee43e901-b0: entered promiscuous mode
Dec  5 07:43:03 np0005546954 NetworkManager[55665]: <info>  [1764938583.0733] manager: (tapee43e901-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Dec  5 07:43:03 np0005546954 nova_compute[187160]: 2025-12-05 12:43:03.072 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:03.078 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee43e901-b0, col_values=(('external_ids', {'iface-id': 'ff42a43f-b4ac-4be3-b747-f3c0a6e67328'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:43:03 np0005546954 nova_compute[187160]: 2025-12-05 12:43:03.080 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:43:03 np0005546954 ovn_controller[95566]: 2025-12-05T12:43:03Z|00031|binding|INFO|Releasing lport ff42a43f-b4ac-4be3-b747-f3c0a6e67328 from this chassis (sb_readonly=0)
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:03.084 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee43e901-b158-4dc0-894f-2384aef8b277.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee43e901-b158-4dc0-894f-2384aef8b277.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:03.086 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[bd10cf1a-a1fa-4576-9956-268e09bd9fab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:03.088 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-ee43e901-b158-4dc0-894f-2384aef8b277
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/ee43e901-b158-4dc0-894f-2384aef8b277.pid.haproxy
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID ee43e901-b158-4dc0-894f-2384aef8b277
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec  5 07:43:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:03.090 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'env', 'PROCESS_TAG=haproxy-ee43e901-b158-4dc0-894f-2384aef8b277', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee43e901-b158-4dc0-894f-2384aef8b277.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec  5 07:43:03 np0005546954 nova_compute[187160]: 2025-12-05 12:43:03.094 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:03 np0005546954 nova_compute[187160]: 2025-12-05 12:43:03.401 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:03 np0005546954 podman[208776]: 2025-12-05 12:43:03.461533016 +0000 UTC m=+0.043692142 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:43:03 np0005546954 nova_compute[187160]: 2025-12-05 12:43:03.867 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:03 np0005546954 podman[208776]: 2025-12-05 12:43:03.985269715 +0000 UTC m=+0.567428771 container create b918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:43:04 np0005546954 systemd[1]: Started libpod-conmon-b918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc.scope.
Dec  5 07:43:04 np0005546954 podman[208789]: 2025-12-05 12:43:04.1842854 +0000 UTC m=+0.157683111 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  5 07:43:04 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:43:04 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3dddda690aabd64c6917a822df0daded70952bb8bf0dac5f394abe51f6444385/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:43:04 np0005546954 podman[208776]: 2025-12-05 12:43:04.240741472 +0000 UTC m=+0.822900618 container init b918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:43:04 np0005546954 podman[208776]: 2025-12-05 12:43:04.248761216 +0000 UTC m=+0.830920292 container start b918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:43:04 np0005546954 neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277[208811]: [NOTICE]   (208817) : New worker (208819) forked
Dec  5 07:43:04 np0005546954 neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277[208811]: [NOTICE]   (208817) : Loading success.
Dec  5 07:43:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:04.328 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  5 07:43:05 np0005546954 podman[197513]: time="2025-12-05T12:43:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:43:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:43:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Dec  5 07:43:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:43:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3027 "" "Go-http-client/1.1"
Dec  5 07:43:08 np0005546954 nova_compute[187160]: 2025-12-05 12:43:08.405 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:08 np0005546954 podman[208828]: 2025-12-05 12:43:08.565701012 +0000 UTC m=+0.073560444 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:43:08 np0005546954 nova_compute[187160]: 2025-12-05 12:43:08.910 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:09 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:09.330 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:43:10 np0005546954 podman[208854]: 2025-12-05 12:43:10.598765074 +0000 UTC m=+0.097978165 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:43:12 np0005546954 ovn_controller[95566]: 2025-12-05T12:43:12Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:99:40:2c 10.100.0.12
Dec  5 07:43:12 np0005546954 ovn_controller[95566]: 2025-12-05T12:43:12Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:99:40:2c 10.100.0.12
Dec  5 07:43:13 np0005546954 nova_compute[187160]: 2025-12-05 12:43:13.408 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:13 np0005546954 nova_compute[187160]: 2025-12-05 12:43:13.913 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:16.939 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:43:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:16.940 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:43:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:16.941 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:43:18 np0005546954 nova_compute[187160]: 2025-12-05 12:43:18.411 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:18 np0005546954 nova_compute[187160]: 2025-12-05 12:43:18.915 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:43:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:43:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:43:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:43:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:43:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:43:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:43:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:43:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:43:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:43:21 np0005546954 podman[208899]: 2025-12-05 12:43:21.567625012 +0000 UTC m=+0.072477869 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:43:21 np0005546954 podman[208898]: 2025-12-05 12:43:21.572457866 +0000 UTC m=+0.078406187 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm)
Dec  5 07:43:23 np0005546954 nova_compute[187160]: 2025-12-05 12:43:23.414 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:23 np0005546954 nova_compute[187160]: 2025-12-05 12:43:23.918 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:28 np0005546954 nova_compute[187160]: 2025-12-05 12:43:28.417 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:28 np0005546954 nova_compute[187160]: 2025-12-05 12:43:28.954 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:33 np0005546954 ovn_controller[95566]: 2025-12-05T12:43:33Z|00032|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Dec  5 07:43:33 np0005546954 nova_compute[187160]: 2025-12-05 12:43:33.422 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:33 np0005546954 nova_compute[187160]: 2025-12-05 12:43:33.957 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:34 np0005546954 podman[208937]: 2025-12-05 12:43:34.567661954 +0000 UTC m=+0.073090789 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:43:35 np0005546954 podman[197513]: time="2025-12-05T12:43:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:43:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:43:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Dec  5 07:43:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:43:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3033 "" "Go-http-client/1.1"
Dec  5 07:43:38 np0005546954 nova_compute[187160]: 2025-12-05 12:43:38.424 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:38 np0005546954 nova_compute[187160]: 2025-12-05 12:43:38.959 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:39 np0005546954 podman[208957]: 2025-12-05 12:43:39.549426056 +0000 UTC m=+0.055427111 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:43:41 np0005546954 podman[208982]: 2025-12-05 12:43:41.637456684 +0000 UTC m=+0.141203529 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:43:43 np0005546954 nova_compute[187160]: 2025-12-05 12:43:43.426 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:43 np0005546954 nova_compute[187160]: 2025-12-05 12:43:43.962 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.454 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "0513a02c-7fe2-43aa-9bd6-020014460672" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.455 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.475 187164 DEBUG nova.compute.manager [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.564 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.565 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.574 187164 DEBUG nova.virt.hardware [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.575 187164 INFO nova.compute.claims [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Claim successful on node compute-1.ctlplane.example.com
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.718 187164 DEBUG nova.compute.provider_tree [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.751 187164 ERROR nova.scheduler.client.report [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [req-65025021-27a2-44bc-98cf-4a50586595cc] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-65025021-27a2-44bc-98cf-4a50586595cc"}]}
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.769 187164 DEBUG nova.scheduler.client.report [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Refreshing inventories for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.791 187164 DEBUG nova.scheduler.client.report [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Updating ProviderTree inventory for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.792 187164 DEBUG nova.compute.provider_tree [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.806 187164 DEBUG nova.scheduler.client.report [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Refreshing aggregate associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.826 187164 DEBUG nova.scheduler.client.report [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Refreshing trait associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.901 187164 DEBUG nova.compute.provider_tree [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.941 187164 DEBUG nova.scheduler.client.report [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Updated inventory for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.942 187164 DEBUG nova.compute.provider_tree [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Updating resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.942 187164 DEBUG nova.compute.provider_tree [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.964 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.399s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:43:46 np0005546954 nova_compute[187160]: 2025-12-05 12:43:46.965 187164 DEBUG nova.compute.manager [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.015 187164 DEBUG nova.compute.manager [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.016 187164 DEBUG nova.network.neutron [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.037 187164 INFO nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.056 187164 DEBUG nova.compute.manager [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.148 187164 DEBUG nova.compute.manager [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.150 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.151 187164 INFO nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Creating image(s)
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.152 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "/var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.153 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "/var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.154 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "/var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.179 187164 DEBUG oslo_concurrency.processutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.272 187164 DEBUG oslo_concurrency.processutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.274 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.275 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.298 187164 DEBUG oslo_concurrency.processutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.365 187164 DEBUG oslo_concurrency.processutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.367 187164 DEBUG oslo_concurrency.processutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.797 187164 DEBUG oslo_concurrency.processutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk 1073741824" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.799 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.524s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.800 187164 DEBUG oslo_concurrency.processutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.858 187164 DEBUG oslo_concurrency.processutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.861 187164 DEBUG nova.virt.disk.api [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Checking if we can resize image /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.862 187164 DEBUG oslo_concurrency.processutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.924 187164 DEBUG oslo_concurrency.processutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.926 187164 DEBUG nova.virt.disk.api [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Cannot resize image /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.927 187164 DEBUG nova.objects.instance [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lazy-loading 'migration_context' on Instance uuid 0513a02c-7fe2-43aa-9bd6-020014460672 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.946 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.947 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Ensure instance console log exists: /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.947 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.948 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:43:47 np0005546954 nova_compute[187160]: 2025-12-05 12:43:47.948 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:43:48 np0005546954 nova_compute[187160]: 2025-12-05 12:43:48.144 187164 DEBUG nova.policy [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7ce7ef64754e4a32b4af3272e31a4a5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:43:48 np0005546954 nova_compute[187160]: 2025-12-05 12:43:48.429 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:49 np0005546954 nova_compute[187160]: 2025-12-05 12:43:49.015 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:43:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:43:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:43:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:43:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:43:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:43:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:43:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:43:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:43:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:43:50 np0005546954 nova_compute[187160]: 2025-12-05 12:43:50.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:43:50 np0005546954 nova_compute[187160]: 2025-12-05 12:43:50.174 187164 DEBUG nova.network.neutron [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Successfully created port: 0f6a798c-c13a-409c-8274-1b8ad42ad19b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:43:51 np0005546954 nova_compute[187160]: 2025-12-05 12:43:51.041 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:43:51 np0005546954 nova_compute[187160]: 2025-12-05 12:43:51.042 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  5 07:43:51 np0005546954 nova_compute[187160]: 2025-12-05 12:43:51.042 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  5 07:43:51 np0005546954 nova_compute[187160]: 2025-12-05 12:43:51.067 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec  5 07:43:51 np0005546954 nova_compute[187160]: 2025-12-05 12:43:51.234 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:43:51 np0005546954 nova_compute[187160]: 2025-12-05 12:43:51.235 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:43:51 np0005546954 nova_compute[187160]: 2025-12-05 12:43:51.236 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec  5 07:43:51 np0005546954 nova_compute[187160]: 2025-12-05 12:43:51.236 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:43:52 np0005546954 podman[209024]: 2025-12-05 12:43:52.564335778 +0000 UTC m=+0.069686702 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350)
Dec  5 07:43:52 np0005546954 podman[209025]: 2025-12-05 12:43:52.579000771 +0000 UTC m=+0.088447863 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:43:53 np0005546954 nova_compute[187160]: 2025-12-05 12:43:53.137 187164 DEBUG nova.network.neutron [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Successfully updated port: 0f6a798c-c13a-409c-8274-1b8ad42ad19b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:43:53 np0005546954 nova_compute[187160]: 2025-12-05 12:43:53.166 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "refresh_cache-0513a02c-7fe2-43aa-9bd6-020014460672" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:43:53 np0005546954 nova_compute[187160]: 2025-12-05 12:43:53.167 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquired lock "refresh_cache-0513a02c-7fe2-43aa-9bd6-020014460672" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:43:53 np0005546954 nova_compute[187160]: 2025-12-05 12:43:53.168 187164 DEBUG nova.network.neutron [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:43:53 np0005546954 nova_compute[187160]: 2025-12-05 12:43:53.248 187164 DEBUG nova.compute.manager [req-e070cee0-518a-4a58-8591-9a324e83092c req-e8338cac-8c8f-4bbb-8b60-616cfa1670cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Received event network-changed-0f6a798c-c13a-409c-8274-1b8ad42ad19b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:43:53 np0005546954 nova_compute[187160]: 2025-12-05 12:43:53.249 187164 DEBUG nova.compute.manager [req-e070cee0-518a-4a58-8591-9a324e83092c req-e8338cac-8c8f-4bbb-8b60-616cfa1670cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Refreshing instance network info cache due to event network-changed-0f6a798c-c13a-409c-8274-1b8ad42ad19b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:43:53 np0005546954 nova_compute[187160]: 2025-12-05 12:43:53.249 187164 DEBUG oslo_concurrency.lockutils [req-e070cee0-518a-4a58-8591-9a324e83092c req-e8338cac-8c8f-4bbb-8b60-616cfa1670cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-0513a02c-7fe2-43aa-9bd6-020014460672" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:43:53 np0005546954 nova_compute[187160]: 2025-12-05 12:43:53.432 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:43:54 np0005546954 nova_compute[187160]: 2025-12-05 12:43:54.017 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:43:54 np0005546954 nova_compute[187160]: 2025-12-05 12:43:54.142 187164 DEBUG nova.network.neutron [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:43:55 np0005546954 nova_compute[187160]: 2025-12-05 12:43:55.842 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Updating instance_info_cache with network_info: [{"id": "f9c0965a-861e-4c24-9c97-679c6d706267", "address": "fa:16:3e:99:40:2c", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9c0965a-86", "ovs_interfaceid": "f9c0965a-861e-4c24-9c97-679c6d706267", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:43:55 np0005546954 nova_compute[187160]: 2025-12-05 12:43:55.873 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:43:55 np0005546954 nova_compute[187160]: 2025-12-05 12:43:55.874 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:43:55 np0005546954 nova_compute[187160]: 2025-12-05 12:43:55.874 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:43:55 np0005546954 nova_compute[187160]: 2025-12-05 12:43:55.875 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:43:55 np0005546954 nova_compute[187160]: 2025-12-05 12:43:55.875 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:43:55 np0005546954 nova_compute[187160]: 2025-12-05 12:43:55.875 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:43:55 np0005546954 nova_compute[187160]: 2025-12-05 12:43:55.878 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.179 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.180 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.181 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.182 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.270 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.351 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.353 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.422 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.569 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.570 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5747MB free_disk=73.31147766113281GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.570 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.571 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.672 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.673 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 0513a02c-7fe2-43aa-9bd6-020014460672 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.673 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.673 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.744 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.762 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.784 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:43:56 np0005546954 nova_compute[187160]: 2025-12-05 12:43:56.785 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.178 187164 DEBUG nova.network.neutron [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Updating instance_info_cache with network_info: [{"id": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "address": "fa:16:3e:1f:8f:94", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6a798c-c1", "ovs_interfaceid": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.217 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Releasing lock "refresh_cache-0513a02c-7fe2-43aa-9bd6-020014460672" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.218 187164 DEBUG nova.compute.manager [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Instance network_info: |[{"id": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "address": "fa:16:3e:1f:8f:94", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6a798c-c1", "ovs_interfaceid": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.218 187164 DEBUG oslo_concurrency.lockutils [req-e070cee0-518a-4a58-8591-9a324e83092c req-e8338cac-8c8f-4bbb-8b60-616cfa1670cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-0513a02c-7fe2-43aa-9bd6-020014460672" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.219 187164 DEBUG nova.network.neutron [req-e070cee0-518a-4a58-8591-9a324e83092c req-e8338cac-8c8f-4bbb-8b60-616cfa1670cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Refreshing network info cache for port 0f6a798c-c13a-409c-8274-1b8ad42ad19b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.222 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Start _get_guest_xml network_info=[{"id": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "address": "fa:16:3e:1f:8f:94", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6a798c-c1", "ovs_interfaceid": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.225 187164 WARNING nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.229 187164 DEBUG nova.virt.libvirt.host [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.230 187164 DEBUG nova.virt.libvirt.host [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.232 187164 DEBUG nova.virt.libvirt.host [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.233 187164 DEBUG nova.virt.libvirt.host [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.234 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.234 187164 DEBUG nova.virt.hardware [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.235 187164 DEBUG nova.virt.hardware [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.235 187164 DEBUG nova.virt.hardware [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.235 187164 DEBUG nova.virt.hardware [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.235 187164 DEBUG nova.virt.hardware [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.235 187164 DEBUG nova.virt.hardware [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.236 187164 DEBUG nova.virt.hardware [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.236 187164 DEBUG nova.virt.hardware [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.236 187164 DEBUG nova.virt.hardware [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.236 187164 DEBUG nova.virt.hardware [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.237 187164 DEBUG nova.virt.hardware [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.240 187164 DEBUG nova.virt.libvirt.vif [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:43:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1665104635',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1665104635',id=4,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-3hvxvigf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:43:47Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=0513a02c-7fe2-43aa-9bd6-020014460672,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "address": "fa:16:3e:1f:8f:94", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6a798c-c1", "ovs_interfaceid": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.240 187164 DEBUG nova.network.os_vif_util [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converting VIF {"id": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "address": "fa:16:3e:1f:8f:94", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6a798c-c1", "ovs_interfaceid": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.241 187164 DEBUG nova.network.os_vif_util [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:8f:94,bridge_name='br-int',has_traffic_filtering=True,id=0f6a798c-c13a-409c-8274-1b8ad42ad19b,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6a798c-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.241 187164 DEBUG nova.objects.instance [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lazy-loading 'pci_devices' on Instance uuid 0513a02c-7fe2-43aa-9bd6-020014460672 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.335 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  <uuid>0513a02c-7fe2-43aa-9bd6-020014460672</uuid>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  <name>instance-00000004</name>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteActionsViaActuator-server-1665104635</nova:name>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 12:43:57</nova:creationTime>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:        <nova:user uuid="7ce7ef64754e4a32b4af3272e31a4a5e">tempest-TestExecuteActionsViaActuator-1570363089-project-member</nova:user>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:        <nova:project uuid="6b5f383ed0484ca1bde081bf623dad4b">tempest-TestExecuteActionsViaActuator-1570363089</nova:project>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:        <nova:port uuid="0f6a798c-c13a-409c-8274-1b8ad42ad19b">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <system>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <entry name="serial">0513a02c-7fe2-43aa-9bd6-020014460672</entry>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <entry name="uuid">0513a02c-7fe2-43aa-9bd6-020014460672</entry>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    </system>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  <os>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  </clock>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk.config"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:1f:8f:94"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <target dev="tap0f6a798c-c1"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/console.log" append="off"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    </serial>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <video>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 07:43:57 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 07:43:57 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:43:57 np0005546954 nova_compute[187160]: </domain>
Dec  5 07:43:57 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.336 187164 DEBUG nova.compute.manager [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Preparing to wait for external event network-vif-plugged-0f6a798c-c13a-409c-8274-1b8ad42ad19b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.337 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.338 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.338 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.340 187164 DEBUG nova.virt.libvirt.vif [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:43:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1665104635',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1665104635',id=4,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-3hvxvigf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:43:47Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=0513a02c-7fe2-43aa-9bd6-020014460672,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "address": "fa:16:3e:1f:8f:94", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6a798c-c1", "ovs_interfaceid": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.340 187164 DEBUG nova.network.os_vif_util [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converting VIF {"id": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "address": "fa:16:3e:1f:8f:94", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6a798c-c1", "ovs_interfaceid": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.341 187164 DEBUG nova.network.os_vif_util [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:8f:94,bridge_name='br-int',has_traffic_filtering=True,id=0f6a798c-c13a-409c-8274-1b8ad42ad19b,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6a798c-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.342 187164 DEBUG os_vif [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:8f:94,bridge_name='br-int',has_traffic_filtering=True,id=0f6a798c-c13a-409c-8274-1b8ad42ad19b,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6a798c-c1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.343 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.344 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.345 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.351 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.351 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f6a798c-c1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.352 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0f6a798c-c1, col_values=(('external_ids', {'iface-id': '0f6a798c-c13a-409c-8274-1b8ad42ad19b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:8f:94', 'vm-uuid': '0513a02c-7fe2-43aa-9bd6-020014460672'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.354 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:43:57 np0005546954 NetworkManager[55665]: <info>  [1764938637.3551] manager: (tap0f6a798c-c1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.356 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.363 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.364 187164 INFO os_vif [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:8f:94,bridge_name='br-int',has_traffic_filtering=True,id=0f6a798c-c13a-409c-8274-1b8ad42ad19b,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6a798c-c1')
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.431 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.432 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.432 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] No VIF found with MAC fa:16:3e:1f:8f:94, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.433 187164 INFO nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Using config drive
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.950 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:43:57 np0005546954 nova_compute[187160]: 2025-12-05 12:43:57.950 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:43:58 np0005546954 nova_compute[187160]: 2025-12-05 12:43:58.387 187164 INFO nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Creating config drive at /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk.config
Dec  5 07:43:58 np0005546954 nova_compute[187160]: 2025-12-05 12:43:58.397 187164 DEBUG oslo_concurrency.processutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_13y39gj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:43:58 np0005546954 nova_compute[187160]: 2025-12-05 12:43:58.528 187164 DEBUG oslo_concurrency.processutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_13y39gj" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:43:58 np0005546954 kernel: tap0f6a798c-c1: entered promiscuous mode
Dec  5 07:43:58 np0005546954 NetworkManager[55665]: <info>  [1764938638.5949] manager: (tap0f6a798c-c1): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Dec  5 07:43:58 np0005546954 nova_compute[187160]: 2025-12-05 12:43:58.637 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:58 np0005546954 ovn_controller[95566]: 2025-12-05T12:43:58Z|00033|binding|INFO|Claiming lport 0f6a798c-c13a-409c-8274-1b8ad42ad19b for this chassis.
Dec  5 07:43:58 np0005546954 ovn_controller[95566]: 2025-12-05T12:43:58Z|00034|binding|INFO|0f6a798c-c13a-409c-8274-1b8ad42ad19b: Claiming fa:16:3e:1f:8f:94 10.100.0.4
Dec  5 07:43:58 np0005546954 systemd-udevd[209089]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.649 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:8f:94 10.100.0.4'], port_security=['fa:16:3e:1f:8f:94 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0513a02c-7fe2-43aa-9bd6-020014460672', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee43e901-b158-4dc0-894f-2384aef8b277', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a865bea6-413e-4ecb-bace-2ec9005935f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c84cb49-52df-48a9-8d24-aff5b642e12a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=0f6a798c-c13a-409c-8274-1b8ad42ad19b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.652 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 0f6a798c-c13a-409c-8274-1b8ad42ad19b in datapath ee43e901-b158-4dc0-894f-2384aef8b277 bound to our chassis
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.654 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee43e901-b158-4dc0-894f-2384aef8b277
Dec  5 07:43:58 np0005546954 NetworkManager[55665]: <info>  [1764938638.6568] device (tap0f6a798c-c1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:43:58 np0005546954 NetworkManager[55665]: <info>  [1764938638.6583] device (tap0f6a798c-c1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:43:58 np0005546954 nova_compute[187160]: 2025-12-05 12:43:58.657 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:58 np0005546954 ovn_controller[95566]: 2025-12-05T12:43:58Z|00035|binding|INFO|Setting lport 0f6a798c-c13a-409c-8274-1b8ad42ad19b ovn-installed in OVS
Dec  5 07:43:58 np0005546954 ovn_controller[95566]: 2025-12-05T12:43:58Z|00036|binding|INFO|Setting lport 0f6a798c-c13a-409c-8274-1b8ad42ad19b up in Southbound
Dec  5 07:43:58 np0005546954 systemd-machined[153497]: New machine qemu-2-instance-00000004.
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.674 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8312f0-bf64-4ca1-89d8-1f49cee42261]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:43:58 np0005546954 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.709 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[eb1c4400-34e2-4cf4-961c-4b1dcb7467b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.713 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[c46c319a-deff-49dc-b64a-2564d36922aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.742 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[0dae66f8-d9b6-4490-af8a-63d2833f95b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.761 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[8476ddb8-9ecc-4f01-b81c-b2a6d0581c31]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee43e901-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:af:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358945, 'reachable_time': 17578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209105, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.777 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf6e96d-e445-4491-b60f-cdfdca99c741]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358963, 'tstamp': 358963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209107, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358968, 'tstamp': 358968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209107, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.780 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee43e901-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:43:58 np0005546954 nova_compute[187160]: 2025-12-05 12:43:58.782 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:58 np0005546954 nova_compute[187160]: 2025-12-05 12:43:58.783 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.783 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee43e901-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.784 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.784 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee43e901-b0, col_values=(('external_ids', {'iface-id': 'ff42a43f-b4ac-4be3-b747-f3c0a6e67328'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:43:58 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:43:58.784 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.018 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.062 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938639.06193, 0513a02c-7fe2-43aa-9bd6-020014460672 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.063 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] VM Started (Lifecycle Event)
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.102 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.107 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938639.062213, 0513a02c-7fe2-43aa-9bd6-020014460672 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.108 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] VM Paused (Lifecycle Event)
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.134 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.139 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.160 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.246 187164 DEBUG nova.compute.manager [req-e6f53361-bb0d-4c87-aafd-dd83213e5bfa req-04d34dd2-21db-4613-90ef-f9a00a69b148 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Received event network-vif-plugged-0f6a798c-c13a-409c-8274-1b8ad42ad19b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.246 187164 DEBUG oslo_concurrency.lockutils [req-e6f53361-bb0d-4c87-aafd-dd83213e5bfa req-04d34dd2-21db-4613-90ef-f9a00a69b148 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.247 187164 DEBUG oslo_concurrency.lockutils [req-e6f53361-bb0d-4c87-aafd-dd83213e5bfa req-04d34dd2-21db-4613-90ef-f9a00a69b148 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.247 187164 DEBUG oslo_concurrency.lockutils [req-e6f53361-bb0d-4c87-aafd-dd83213e5bfa req-04d34dd2-21db-4613-90ef-f9a00a69b148 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.248 187164 DEBUG nova.compute.manager [req-e6f53361-bb0d-4c87-aafd-dd83213e5bfa req-04d34dd2-21db-4613-90ef-f9a00a69b148 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Processing event network-vif-plugged-0f6a798c-c13a-409c-8274-1b8ad42ad19b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.249 187164 DEBUG nova.compute.manager [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.253 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938639.253401, 0513a02c-7fe2-43aa-9bd6-020014460672 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.254 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] VM Resumed (Lifecycle Event)
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.257 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.261 187164 INFO nova.virt.libvirt.driver [-] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Instance spawned successfully.
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.262 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.291 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.300 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.305 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.306 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.306 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.307 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.308 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.308 187164 DEBUG nova.virt.libvirt.driver [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.341 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.379 187164 INFO nova.compute.manager [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Took 12.23 seconds to spawn the instance on the hypervisor.
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.380 187164 DEBUG nova.compute.manager [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.449 187164 INFO nova.compute.manager [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Took 12.93 seconds to build instance.
Dec  5 07:43:59 np0005546954 nova_compute[187160]: 2025-12-05 12:43:59.469 187164 DEBUG oslo_concurrency.lockutils [None req-74e33f3e-5c50-4acd-9058-1dc15ff47025 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:44:00 np0005546954 nova_compute[187160]: 2025-12-05 12:44:00.279 187164 DEBUG nova.network.neutron [req-e070cee0-518a-4a58-8591-9a324e83092c req-e8338cac-8c8f-4bbb-8b60-616cfa1670cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Updated VIF entry in instance network info cache for port 0f6a798c-c13a-409c-8274-1b8ad42ad19b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  5 07:44:00 np0005546954 nova_compute[187160]: 2025-12-05 12:44:00.280 187164 DEBUG nova.network.neutron [req-e070cee0-518a-4a58-8591-9a324e83092c req-e8338cac-8c8f-4bbb-8b60-616cfa1670cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Updating instance_info_cache with network_info: [{"id": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "address": "fa:16:3e:1f:8f:94", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6a798c-c1", "ovs_interfaceid": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:44:00 np0005546954 nova_compute[187160]: 2025-12-05 12:44:00.303 187164 DEBUG oslo_concurrency.lockutils [req-e070cee0-518a-4a58-8591-9a324e83092c req-e8338cac-8c8f-4bbb-8b60-616cfa1670cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-0513a02c-7fe2-43aa-9bd6-020014460672" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:44:01 np0005546954 nova_compute[187160]: 2025-12-05 12:44:01.332 187164 DEBUG nova.compute.manager [req-76e44973-6ed2-44f8-af48-07bbdf31f38c req-bb02a011-f1ff-4a6c-bfa2-f6d23a1b4da5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Received event network-vif-plugged-0f6a798c-c13a-409c-8274-1b8ad42ad19b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:44:01 np0005546954 nova_compute[187160]: 2025-12-05 12:44:01.333 187164 DEBUG oslo_concurrency.lockutils [req-76e44973-6ed2-44f8-af48-07bbdf31f38c req-bb02a011-f1ff-4a6c-bfa2-f6d23a1b4da5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:44:01 np0005546954 nova_compute[187160]: 2025-12-05 12:44:01.333 187164 DEBUG oslo_concurrency.lockutils [req-76e44973-6ed2-44f8-af48-07bbdf31f38c req-bb02a011-f1ff-4a6c-bfa2-f6d23a1b4da5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:44:01 np0005546954 nova_compute[187160]: 2025-12-05 12:44:01.334 187164 DEBUG oslo_concurrency.lockutils [req-76e44973-6ed2-44f8-af48-07bbdf31f38c req-bb02a011-f1ff-4a6c-bfa2-f6d23a1b4da5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:44:01 np0005546954 nova_compute[187160]: 2025-12-05 12:44:01.334 187164 DEBUG nova.compute.manager [req-76e44973-6ed2-44f8-af48-07bbdf31f38c req-bb02a011-f1ff-4a6c-bfa2-f6d23a1b4da5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] No waiting events found dispatching network-vif-plugged-0f6a798c-c13a-409c-8274-1b8ad42ad19b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:44:01 np0005546954 nova_compute[187160]: 2025-12-05 12:44:01.334 187164 WARNING nova.compute.manager [req-76e44973-6ed2-44f8-af48-07bbdf31f38c req-bb02a011-f1ff-4a6c-bfa2-f6d23a1b4da5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Received unexpected event network-vif-plugged-0f6a798c-c13a-409c-8274-1b8ad42ad19b for instance with vm_state active and task_state None.
Dec  5 07:44:02 np0005546954 nova_compute[187160]: 2025-12-05 12:44:02.392 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:04 np0005546954 nova_compute[187160]: 2025-12-05 12:44:04.022 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:05 np0005546954 podman[209115]: 2025-12-05 12:44:05.584000101 +0000 UTC m=+0.077710175 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  5 07:44:05 np0005546954 podman[197513]: time="2025-12-05T12:44:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:44:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:44:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Dec  5 07:44:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:44:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3039 "" "Go-http-client/1.1"
Dec  5 07:44:07 np0005546954 nova_compute[187160]: 2025-12-05 12:44:07.396 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:44:09 np0005546954 nova_compute[187160]: 2025-12-05 12:44:09.025 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:44:10 np0005546954 podman[209141]: 2025-12-05 12:44:10.563957114 +0000 UTC m=+0.076116555 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:44:11 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:11Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:8f:94 10.100.0.4
Dec  5 07:44:11 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:11Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:8f:94 10.100.0.4
Dec  5 07:44:11 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:11.913 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  5 07:44:11 np0005546954 nova_compute[187160]: 2025-12-05 12:44:11.914 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:44:11 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:11.915 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  5 07:44:12 np0005546954 nova_compute[187160]: 2025-12-05 12:44:12.399 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:44:12 np0005546954 podman[209185]: 2025-12-05 12:44:12.585602427 +0000 UTC m=+0.099560556 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  5 07:44:14 np0005546954 nova_compute[187160]: 2025-12-05 12:44:14.031 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:44:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:16.941 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:44:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:16.941 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:44:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:16.942 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:44:17 np0005546954 nova_compute[187160]: 2025-12-05 12:44:17.402 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:44:18 np0005546954 nova_compute[187160]: 2025-12-05 12:44:18.669 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "a133dad5-02c4-4021-90e5-ee9f3322f351" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:44:18 np0005546954 nova_compute[187160]: 2025-12-05 12:44:18.670 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:44:18 np0005546954 nova_compute[187160]: 2025-12-05 12:44:18.697 187164 DEBUG nova.compute.manager [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:44:18 np0005546954 nova_compute[187160]: 2025-12-05 12:44:18.774 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:44:18 np0005546954 nova_compute[187160]: 2025-12-05 12:44:18.775 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:44:18 np0005546954 nova_compute[187160]: 2025-12-05 12:44:18.783 187164 DEBUG nova.virt.hardware [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:44:18 np0005546954 nova_compute[187160]: 2025-12-05 12:44:18.783 187164 INFO nova.compute.claims [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Claim successful on node compute-1.ctlplane.example.com
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.036 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.147 187164 DEBUG nova.compute.provider_tree [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.162 187164 DEBUG nova.scheduler.client.report [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.182 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.407s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.183 187164 DEBUG nova.compute.manager [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.237 187164 DEBUG nova.compute.manager [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.237 187164 DEBUG nova.network.neutron [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.257 187164 INFO nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.288 187164 DEBUG nova.compute.manager [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.383 187164 DEBUG nova.compute.manager [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.384 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.385 187164 INFO nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Creating image(s)
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.387 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "/var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.387 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "/var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.388 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "/var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.401 187164 DEBUG oslo_concurrency.processutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:44:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:44:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:44:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:44:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:44:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:44:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:44:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:44:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:44:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:44:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:44:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:44:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.465 187164 DEBUG oslo_concurrency.processutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.467 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.467 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.478 187164 DEBUG oslo_concurrency.processutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.534 187164 DEBUG oslo_concurrency.processutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.536 187164 DEBUG oslo_concurrency.processutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.569 187164 DEBUG oslo_concurrency.processutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.570 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.571 187164 DEBUG oslo_concurrency.processutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.626 187164 DEBUG oslo_concurrency.processutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.628 187164 DEBUG nova.virt.disk.api [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Checking if we can resize image /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.628 187164 DEBUG oslo_concurrency.processutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.682 187164 DEBUG oslo_concurrency.processutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.684 187164 DEBUG nova.virt.disk.api [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Cannot resize image /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.684 187164 DEBUG nova.objects.instance [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lazy-loading 'migration_context' on Instance uuid a133dad5-02c4-4021-90e5-ee9f3322f351 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.709 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.709 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Ensure instance console log exists: /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.710 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.710 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:44:19 np0005546954 nova_compute[187160]: 2025-12-05 12:44:19.710 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:44:20 np0005546954 nova_compute[187160]: 2025-12-05 12:44:20.369 187164 DEBUG nova.policy [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7ce7ef64754e4a32b4af3272e31a4a5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:44:21 np0005546954 nova_compute[187160]: 2025-12-05 12:44:21.237 187164 DEBUG nova.network.neutron [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Successfully created port: 1f839a4f-9bf4-4b39-aca7-83959475d57e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:44:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:21.919 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:22 np0005546954 nova_compute[187160]: 2025-12-05 12:44:22.442 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:22 np0005546954 nova_compute[187160]: 2025-12-05 12:44:22.558 187164 DEBUG nova.network.neutron [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Successfully updated port: 1f839a4f-9bf4-4b39-aca7-83959475d57e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:44:22 np0005546954 nova_compute[187160]: 2025-12-05 12:44:22.576 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "refresh_cache-a133dad5-02c4-4021-90e5-ee9f3322f351" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:44:22 np0005546954 nova_compute[187160]: 2025-12-05 12:44:22.576 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquired lock "refresh_cache-a133dad5-02c4-4021-90e5-ee9f3322f351" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:44:22 np0005546954 nova_compute[187160]: 2025-12-05 12:44:22.577 187164 DEBUG nova.network.neutron [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:44:22 np0005546954 nova_compute[187160]: 2025-12-05 12:44:22.665 187164 DEBUG nova.compute.manager [req-53e04f8e-38f2-4d16-b251-fb7984bfc8e5 req-813b8f1f-5f2a-48f6-8606-aa2c7ad5dba6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Received event network-changed-1f839a4f-9bf4-4b39-aca7-83959475d57e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:44:22 np0005546954 nova_compute[187160]: 2025-12-05 12:44:22.666 187164 DEBUG nova.compute.manager [req-53e04f8e-38f2-4d16-b251-fb7984bfc8e5 req-813b8f1f-5f2a-48f6-8606-aa2c7ad5dba6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Refreshing instance network info cache due to event network-changed-1f839a4f-9bf4-4b39-aca7-83959475d57e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:44:22 np0005546954 nova_compute[187160]: 2025-12-05 12:44:22.666 187164 DEBUG oslo_concurrency.lockutils [req-53e04f8e-38f2-4d16-b251-fb7984bfc8e5 req-813b8f1f-5f2a-48f6-8606-aa2c7ad5dba6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-a133dad5-02c4-4021-90e5-ee9f3322f351" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:44:23 np0005546954 nova_compute[187160]: 2025-12-05 12:44:23.322 187164 DEBUG nova.network.neutron [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:44:23 np0005546954 podman[209228]: 2025-12-05 12:44:23.569537741 +0000 UTC m=+0.069122354 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 07:44:23 np0005546954 podman[209227]: 2025-12-05 12:44:23.582030966 +0000 UTC m=+0.085833771 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350)
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.038 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.610 187164 DEBUG nova.network.neutron [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Updating instance_info_cache with network_info: [{"id": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "address": "fa:16:3e:bd:0e:16", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f839a4f-9b", "ovs_interfaceid": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.629 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Releasing lock "refresh_cache-a133dad5-02c4-4021-90e5-ee9f3322f351" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.630 187164 DEBUG nova.compute.manager [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Instance network_info: |[{"id": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "address": "fa:16:3e:bd:0e:16", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f839a4f-9b", "ovs_interfaceid": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.631 187164 DEBUG oslo_concurrency.lockutils [req-53e04f8e-38f2-4d16-b251-fb7984bfc8e5 req-813b8f1f-5f2a-48f6-8606-aa2c7ad5dba6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-a133dad5-02c4-4021-90e5-ee9f3322f351" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.631 187164 DEBUG nova.network.neutron [req-53e04f8e-38f2-4d16-b251-fb7984bfc8e5 req-813b8f1f-5f2a-48f6-8606-aa2c7ad5dba6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Refreshing network info cache for port 1f839a4f-9bf4-4b39-aca7-83959475d57e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.636 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Start _get_guest_xml network_info=[{"id": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "address": "fa:16:3e:bd:0e:16", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f839a4f-9b", "ovs_interfaceid": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.644 187164 WARNING nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.653 187164 DEBUG nova.virt.libvirt.host [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.653 187164 DEBUG nova.virt.libvirt.host [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.657 187164 DEBUG nova.virt.libvirt.host [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.658 187164 DEBUG nova.virt.libvirt.host [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.659 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.659 187164 DEBUG nova.virt.hardware [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.659 187164 DEBUG nova.virt.hardware [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.660 187164 DEBUG nova.virt.hardware [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.660 187164 DEBUG nova.virt.hardware [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.660 187164 DEBUG nova.virt.hardware [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.660 187164 DEBUG nova.virt.hardware [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.660 187164 DEBUG nova.virt.hardware [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.661 187164 DEBUG nova.virt.hardware [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.661 187164 DEBUG nova.virt.hardware [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.661 187164 DEBUG nova.virt.hardware [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.661 187164 DEBUG nova.virt.hardware [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.664 187164 DEBUG nova.virt.libvirt.vif [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:44:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1937603378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1937603378',id=6,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-kl33bqt4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-
project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:44:19Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=a133dad5-02c4-4021-90e5-ee9f3322f351,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "address": "fa:16:3e:bd:0e:16", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f839a4f-9b", "ovs_interfaceid": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.665 187164 DEBUG nova.network.os_vif_util [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converting VIF {"id": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "address": "fa:16:3e:bd:0e:16", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f839a4f-9b", "ovs_interfaceid": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.665 187164 DEBUG nova.network.os_vif_util [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:0e:16,bridge_name='br-int',has_traffic_filtering=True,id=1f839a4f-9bf4-4b39-aca7-83959475d57e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f839a4f-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.666 187164 DEBUG nova.objects.instance [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lazy-loading 'pci_devices' on Instance uuid a133dad5-02c4-4021-90e5-ee9f3322f351 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.679 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  <uuid>a133dad5-02c4-4021-90e5-ee9f3322f351</uuid>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  <name>instance-00000006</name>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteActionsViaActuator-server-1937603378</nova:name>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 12:44:24</nova:creationTime>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:        <nova:user uuid="7ce7ef64754e4a32b4af3272e31a4a5e">tempest-TestExecuteActionsViaActuator-1570363089-project-member</nova:user>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:        <nova:project uuid="6b5f383ed0484ca1bde081bf623dad4b">tempest-TestExecuteActionsViaActuator-1570363089</nova:project>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:        <nova:port uuid="1f839a4f-9bf4-4b39-aca7-83959475d57e">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <system>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <entry name="serial">a133dad5-02c4-4021-90e5-ee9f3322f351</entry>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <entry name="uuid">a133dad5-02c4-4021-90e5-ee9f3322f351</entry>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    </system>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  <os>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  </clock>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk.config"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:bd:0e:16"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <target dev="tap1f839a4f-9b"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/console.log" append="off"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    </serial>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <video>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 07:44:24 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 07:44:24 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:44:24 np0005546954 nova_compute[187160]: </domain>
Dec  5 07:44:24 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.680 187164 DEBUG nova.compute.manager [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Preparing to wait for external event network-vif-plugged-1f839a4f-9bf4-4b39-aca7-83959475d57e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.680 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.681 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.681 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.682 187164 DEBUG nova.virt.libvirt.vif [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:44:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1937603378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1937603378',id=6,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-kl33bqt4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1
570363089-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:44:19Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=a133dad5-02c4-4021-90e5-ee9f3322f351,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "address": "fa:16:3e:bd:0e:16", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f839a4f-9b", "ovs_interfaceid": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.682 187164 DEBUG nova.network.os_vif_util [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converting VIF {"id": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "address": "fa:16:3e:bd:0e:16", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f839a4f-9b", "ovs_interfaceid": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.682 187164 DEBUG nova.network.os_vif_util [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:0e:16,bridge_name='br-int',has_traffic_filtering=True,id=1f839a4f-9bf4-4b39-aca7-83959475d57e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f839a4f-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.683 187164 DEBUG os_vif [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:0e:16,bridge_name='br-int',has_traffic_filtering=True,id=1f839a4f-9bf4-4b39-aca7-83959475d57e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f839a4f-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.683 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.683 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.684 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.689 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.689 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f839a4f-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.690 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1f839a4f-9b, col_values=(('external_ids', {'iface-id': '1f839a4f-9bf4-4b39-aca7-83959475d57e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:0e:16', 'vm-uuid': 'a133dad5-02c4-4021-90e5-ee9f3322f351'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.692 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:24 np0005546954 NetworkManager[55665]: <info>  [1764938664.6944] manager: (tap1f839a4f-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.695 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.702 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.704 187164 INFO os_vif [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:0e:16,bridge_name='br-int',has_traffic_filtering=True,id=1f839a4f-9bf4-4b39-aca7-83959475d57e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f839a4f-9b')#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.762 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.763 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.763 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] No VIF found with MAC fa:16:3e:bd:0e:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:44:24 np0005546954 nova_compute[187160]: 2025-12-05 12:44:24.764 187164 INFO nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Using config drive#033[00m
Dec  5 07:44:25 np0005546954 nova_compute[187160]: 2025-12-05 12:44:25.678 187164 INFO nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Creating config drive at /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk.config#033[00m
Dec  5 07:44:25 np0005546954 nova_compute[187160]: 2025-12-05 12:44:25.684 187164 DEBUG oslo_concurrency.processutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpib5miw9v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:25 np0005546954 nova_compute[187160]: 2025-12-05 12:44:25.813 187164 DEBUG oslo_concurrency.processutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpib5miw9v" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:25 np0005546954 kernel: tap1f839a4f-9b: entered promiscuous mode
Dec  5 07:44:25 np0005546954 NetworkManager[55665]: <info>  [1764938665.8921] manager: (tap1f839a4f-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/27)
Dec  5 07:44:25 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:25Z|00037|binding|INFO|Claiming lport 1f839a4f-9bf4-4b39-aca7-83959475d57e for this chassis.
Dec  5 07:44:25 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:25Z|00038|binding|INFO|1f839a4f-9bf4-4b39-aca7-83959475d57e: Claiming fa:16:3e:bd:0e:16 10.100.0.7
Dec  5 07:44:25 np0005546954 nova_compute[187160]: 2025-12-05 12:44:25.893 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:25 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:25Z|00039|binding|INFO|Setting lport 1f839a4f-9bf4-4b39-aca7-83959475d57e ovn-installed in OVS
Dec  5 07:44:25 np0005546954 nova_compute[187160]: 2025-12-05 12:44:25.923 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:25 np0005546954 systemd-udevd[209287]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:44:25 np0005546954 nova_compute[187160]: 2025-12-05 12:44:25.930 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:25 np0005546954 NetworkManager[55665]: <info>  [1764938665.9461] device (tap1f839a4f-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:44:25 np0005546954 NetworkManager[55665]: <info>  [1764938665.9471] device (tap1f839a4f-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:44:25 np0005546954 systemd-machined[153497]: New machine qemu-3-instance-00000006.
Dec  5 07:44:25 np0005546954 systemd[1]: Started Virtual Machine qemu-3-instance-00000006.
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.068 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:0e:16 10.100.0.7'], port_security=['fa:16:3e:bd:0e:16 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a133dad5-02c4-4021-90e5-ee9f3322f351', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee43e901-b158-4dc0-894f-2384aef8b277', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a865bea6-413e-4ecb-bace-2ec9005935f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c84cb49-52df-48a9-8d24-aff5b642e12a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=1f839a4f-9bf4-4b39-aca7-83959475d57e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:44:26 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:26Z|00040|binding|INFO|Setting lport 1f839a4f-9bf4-4b39-aca7-83959475d57e up in Southbound
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.073 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 1f839a4f-9bf4-4b39-aca7-83959475d57e in datapath ee43e901-b158-4dc0-894f-2384aef8b277 bound to our chassis#033[00m
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.077 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee43e901-b158-4dc0-894f-2384aef8b277#033[00m
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.094 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[2354cfee-c043-4866-aaf9-fad404de827b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.130 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[bddafd74-3f26-4f90-9780-a84cfe0ab9c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.135 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[9d0982ed-77ea-4925-b1d7-cfb97ab44182]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.171 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[daae3fca-7f48-4f17-8d2b-ba3233082594]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.197 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ff650c-a3f5-45a8-a433-cb61b37f5231]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee43e901-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:af:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358945, 'reachable_time': 17578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209304, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.217 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b9d1a39b-4863-43e8-a9a0-dd429844e1f7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358963, 'tstamp': 358963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209305, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358968, 'tstamp': 358968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209305, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.219 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee43e901-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:26 np0005546954 nova_compute[187160]: 2025-12-05 12:44:26.221 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:26 np0005546954 nova_compute[187160]: 2025-12-05 12:44:26.222 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.223 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee43e901-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.224 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.225 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee43e901-b0, col_values=(('external_ids', {'iface-id': 'ff42a43f-b4ac-4be3-b747-f3c0a6e67328'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:26.225 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:44:26 np0005546954 nova_compute[187160]: 2025-12-05 12:44:26.506 187164 DEBUG nova.compute.manager [req-9d8d26e0-c021-474c-a969-305b39d8ce45 req-dcb655c0-f73f-4c56-a188-b328ba2b00fa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Received event network-vif-plugged-1f839a4f-9bf4-4b39-aca7-83959475d57e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:44:26 np0005546954 nova_compute[187160]: 2025-12-05 12:44:26.506 187164 DEBUG oslo_concurrency.lockutils [req-9d8d26e0-c021-474c-a969-305b39d8ce45 req-dcb655c0-f73f-4c56-a188-b328ba2b00fa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:26 np0005546954 nova_compute[187160]: 2025-12-05 12:44:26.507 187164 DEBUG oslo_concurrency.lockutils [req-9d8d26e0-c021-474c-a969-305b39d8ce45 req-dcb655c0-f73f-4c56-a188-b328ba2b00fa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:26 np0005546954 nova_compute[187160]: 2025-12-05 12:44:26.507 187164 DEBUG oslo_concurrency.lockutils [req-9d8d26e0-c021-474c-a969-305b39d8ce45 req-dcb655c0-f73f-4c56-a188-b328ba2b00fa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:26 np0005546954 nova_compute[187160]: 2025-12-05 12:44:26.508 187164 DEBUG nova.compute.manager [req-9d8d26e0-c021-474c-a969-305b39d8ce45 req-dcb655c0-f73f-4c56-a188-b328ba2b00fa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Processing event network-vif-plugged-1f839a4f-9bf4-4b39-aca7-83959475d57e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.343 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938667.342518, a133dad5-02c4-4021-90e5-ee9f3322f351 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.345 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] VM Started (Lifecycle Event)#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.348 187164 DEBUG nova.compute.manager [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.353 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.357 187164 INFO nova.virt.libvirt.driver [-] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Instance spawned successfully.#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.357 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.361 187164 DEBUG nova.network.neutron [req-53e04f8e-38f2-4d16-b251-fb7984bfc8e5 req-813b8f1f-5f2a-48f6-8606-aa2c7ad5dba6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Updated VIF entry in instance network info cache for port 1f839a4f-9bf4-4b39-aca7-83959475d57e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.361 187164 DEBUG nova.network.neutron [req-53e04f8e-38f2-4d16-b251-fb7984bfc8e5 req-813b8f1f-5f2a-48f6-8606-aa2c7ad5dba6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Updating instance_info_cache with network_info: [{"id": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "address": "fa:16:3e:bd:0e:16", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f839a4f-9b", "ovs_interfaceid": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.373 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.377 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.384 187164 DEBUG oslo_concurrency.lockutils [req-53e04f8e-38f2-4d16-b251-fb7984bfc8e5 req-813b8f1f-5f2a-48f6-8606-aa2c7ad5dba6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-a133dad5-02c4-4021-90e5-ee9f3322f351" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.388 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.388 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.388 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.389 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.389 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.390 187164 DEBUG nova.virt.libvirt.driver [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.396 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.396 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938667.3428223, a133dad5-02c4-4021-90e5-ee9f3322f351 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.397 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.424 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.429 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938667.3525443, a133dad5-02c4-4021-90e5-ee9f3322f351 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.429 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.447 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.450 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.457 187164 INFO nova.compute.manager [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Took 8.07 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.457 187164 DEBUG nova.compute.manager [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.466 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.514 187164 INFO nova.compute.manager [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Took 8.77 seconds to build instance.#033[00m
Dec  5 07:44:27 np0005546954 nova_compute[187160]: 2025-12-05 12:44:27.530 187164 DEBUG oslo_concurrency.lockutils [None req-4d7035c2-95d4-4b45-bd8b-a5a54c8a5aae 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:29 np0005546954 nova_compute[187160]: 2025-12-05 12:44:29.041 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:29 np0005546954 nova_compute[187160]: 2025-12-05 12:44:29.619 187164 DEBUG nova.compute.manager [req-fd95fcba-3e49-45c1-add0-5b6ffa31e6d9 req-38125647-80ac-4bcf-9cb0-40681eb2f5c5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Received event network-vif-plugged-1f839a4f-9bf4-4b39-aca7-83959475d57e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:44:29 np0005546954 nova_compute[187160]: 2025-12-05 12:44:29.619 187164 DEBUG oslo_concurrency.lockutils [req-fd95fcba-3e49-45c1-add0-5b6ffa31e6d9 req-38125647-80ac-4bcf-9cb0-40681eb2f5c5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:29 np0005546954 nova_compute[187160]: 2025-12-05 12:44:29.620 187164 DEBUG oslo_concurrency.lockutils [req-fd95fcba-3e49-45c1-add0-5b6ffa31e6d9 req-38125647-80ac-4bcf-9cb0-40681eb2f5c5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:29 np0005546954 nova_compute[187160]: 2025-12-05 12:44:29.621 187164 DEBUG oslo_concurrency.lockutils [req-fd95fcba-3e49-45c1-add0-5b6ffa31e6d9 req-38125647-80ac-4bcf-9cb0-40681eb2f5c5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:29 np0005546954 nova_compute[187160]: 2025-12-05 12:44:29.621 187164 DEBUG nova.compute.manager [req-fd95fcba-3e49-45c1-add0-5b6ffa31e6d9 req-38125647-80ac-4bcf-9cb0-40681eb2f5c5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] No waiting events found dispatching network-vif-plugged-1f839a4f-9bf4-4b39-aca7-83959475d57e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:44:29 np0005546954 nova_compute[187160]: 2025-12-05 12:44:29.622 187164 WARNING nova.compute.manager [req-fd95fcba-3e49-45c1-add0-5b6ffa31e6d9 req-38125647-80ac-4bcf-9cb0-40681eb2f5c5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Received unexpected event network-vif-plugged-1f839a4f-9bf4-4b39-aca7-83959475d57e for instance with vm_state active and task_state None.#033[00m
Dec  5 07:44:29 np0005546954 nova_compute[187160]: 2025-12-05 12:44:29.693 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:34 np0005546954 nova_compute[187160]: 2025-12-05 12:44:34.044 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:34 np0005546954 nova_compute[187160]: 2025-12-05 12:44:34.695 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:35 np0005546954 podman[197513]: time="2025-12-05T12:44:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:44:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:44:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Dec  5 07:44:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:44:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3034 "" "Go-http-client/1.1"
Dec  5 07:44:36 np0005546954 podman[209313]: 2025-12-05 12:44:36.603312325 +0000 UTC m=+0.096988593 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.660 187164 DEBUG nova.compute.manager [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.733 187164 DEBUG nova.virt.libvirt.driver [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Creating tmpfile /var/lib/nova/instances/tmpy21d6ygo to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.750 187164 DEBUG oslo_concurrency.lockutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.751 187164 DEBUG oslo_concurrency.lockutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.783 187164 DEBUG nova.objects.instance [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'pci_requests' on Instance uuid bd45ae9f-9649-4347-a5e4-658d02804ef9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.803 187164 DEBUG nova.virt.hardware [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.803 187164 INFO nova.compute.claims [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.804 187164 DEBUG nova.objects.instance [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'resources' on Instance uuid bd45ae9f-9649-4347-a5e4-658d02804ef9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.825 187164 DEBUG nova.objects.instance [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'numa_topology' on Instance uuid bd45ae9f-9649-4347-a5e4-658d02804ef9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.850 187164 DEBUG nova.objects.instance [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'pci_devices' on Instance uuid bd45ae9f-9649-4347-a5e4-658d02804ef9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.868 187164 DEBUG nova.compute.manager [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy21d6ygo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.886 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.886 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.896 187164 INFO nova.compute.rpcapi [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.897 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.901 187164 INFO nova.compute.resource_tracker [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Updating resource usage from migration a45b1cf8-9c77-4611-bf5f-26795e8b6e03#033[00m
Dec  5 07:44:37 np0005546954 nova_compute[187160]: 2025-12-05 12:44:37.901 187164 DEBUG nova.compute.resource_tracker [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Starting to track incoming migration a45b1cf8-9c77-4611-bf5f-26795e8b6e03 with flavor b4ea63be-97f8-4a48-b000-66321c4ddb27 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Dec  5 07:44:38 np0005546954 nova_compute[187160]: 2025-12-05 12:44:38.052 187164 DEBUG nova.compute.provider_tree [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:44:38 np0005546954 nova_compute[187160]: 2025-12-05 12:44:38.070 187164 DEBUG nova.scheduler.client.report [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:44:38 np0005546954 nova_compute[187160]: 2025-12-05 12:44:38.101 187164 DEBUG oslo_concurrency.lockutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:38 np0005546954 nova_compute[187160]: 2025-12-05 12:44:38.102 187164 INFO nova.compute.manager [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Migrating#033[00m
Dec  5 07:44:38 np0005546954 nova_compute[187160]: 2025-12-05 12:44:38.975 187164 DEBUG nova.compute.manager [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy21d6ygo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dde775e5-862e-4f88-b0e9-7d98a681bb3e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  5 07:44:39 np0005546954 nova_compute[187160]: 2025-12-05 12:44:39.025 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-dde775e5-862e-4f88-b0e9-7d98a681bb3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:44:39 np0005546954 nova_compute[187160]: 2025-12-05 12:44:39.026 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-dde775e5-862e-4f88-b0e9-7d98a681bb3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:44:39 np0005546954 nova_compute[187160]: 2025-12-05 12:44:39.026 187164 DEBUG nova.network.neutron [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:44:39 np0005546954 nova_compute[187160]: 2025-12-05 12:44:39.046 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:39 np0005546954 nova_compute[187160]: 2025-12-05 12:44:39.698 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:39 np0005546954 systemd[1]: Created slice User Slice of UID 42436.
Dec  5 07:44:39 np0005546954 systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec  5 07:44:39 np0005546954 systemd-logind[789]: New session 28 of user nova.
Dec  5 07:44:39 np0005546954 systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec  5 07:44:39 np0005546954 systemd[1]: Starting User Manager for UID 42436...
Dec  5 07:44:40 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:40Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bd:0e:16 10.100.0.7
Dec  5 07:44:40 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:40Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bd:0e:16 10.100.0.7
Dec  5 07:44:40 np0005546954 systemd[209350]: Queued start job for default target Main User Target.
Dec  5 07:44:40 np0005546954 systemd[209350]: Created slice User Application Slice.
Dec  5 07:44:40 np0005546954 systemd[209350]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  5 07:44:40 np0005546954 systemd[209350]: Started Daily Cleanup of User's Temporary Directories.
Dec  5 07:44:40 np0005546954 systemd[209350]: Reached target Paths.
Dec  5 07:44:40 np0005546954 systemd[209350]: Reached target Timers.
Dec  5 07:44:40 np0005546954 systemd[209350]: Starting D-Bus User Message Bus Socket...
Dec  5 07:44:40 np0005546954 systemd[209350]: Starting Create User's Volatile Files and Directories...
Dec  5 07:44:40 np0005546954 systemd[209350]: Finished Create User's Volatile Files and Directories.
Dec  5 07:44:40 np0005546954 systemd[209350]: Listening on D-Bus User Message Bus Socket.
Dec  5 07:44:40 np0005546954 systemd[209350]: Reached target Sockets.
Dec  5 07:44:40 np0005546954 systemd[209350]: Reached target Basic System.
Dec  5 07:44:40 np0005546954 systemd[209350]: Reached target Main User Target.
Dec  5 07:44:40 np0005546954 systemd[209350]: Startup finished in 162ms.
Dec  5 07:44:40 np0005546954 systemd[1]: Started User Manager for UID 42436.
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.178 187164 DEBUG nova.network.neutron [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Updating instance_info_cache with network_info: [{"id": "79860104-80d8-4998-9c9d-057e3c980d6e", "address": "fa:16:3e:e8:c4:c3", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79860104-80", "ovs_interfaceid": "79860104-80d8-4998-9c9d-057e3c980d6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:44:40 np0005546954 systemd[1]: Started Session 28 of User nova.
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.198 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-dde775e5-862e-4f88-b0e9-7d98a681bb3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.200 187164 DEBUG nova.virt.libvirt.driver [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy21d6ygo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dde775e5-862e-4f88-b0e9-7d98a681bb3e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.200 187164 DEBUG nova.virt.libvirt.driver [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Creating instance directory: /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.201 187164 DEBUG nova.virt.libvirt.driver [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Creating disk.info with the contents: {'/var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk': 'qcow2', '/var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.202 187164 DEBUG nova.virt.libvirt.driver [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.202 187164 DEBUG nova.objects.instance [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'trusted_certs' on Instance uuid dde775e5-862e-4f88-b0e9-7d98a681bb3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.225 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:40 np0005546954 systemd[1]: session-28.scope: Deactivated successfully.
Dec  5 07:44:40 np0005546954 systemd-logind[789]: Session 28 logged out. Waiting for processes to exit.
Dec  5 07:44:40 np0005546954 systemd-logind[789]: Removed session 28.
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.319 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.321 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.322 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.333 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:40 np0005546954 systemd-logind[789]: New session 30 of user nova.
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.404 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.406 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:40 np0005546954 systemd[1]: Started Session 30 of User nova.
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.450 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.451 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.452 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:40 np0005546954 systemd[1]: session-30.scope: Deactivated successfully.
Dec  5 07:44:40 np0005546954 systemd-logind[789]: Session 30 logged out. Waiting for processes to exit.
Dec  5 07:44:40 np0005546954 systemd-logind[789]: Removed session 30.
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.519 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.521 187164 DEBUG nova.virt.disk.api [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Checking if we can resize image /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.521 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.583 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.584 187164 DEBUG nova.virt.disk.api [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Cannot resize image /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.584 187164 DEBUG nova.objects.instance [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'migration_context' on Instance uuid dde775e5-862e-4f88-b0e9-7d98a681bb3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.601 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.626 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk.config 485376" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.630 187164 DEBUG nova.virt.libvirt.volume.remotefs [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk.config to /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  5 07:44:40 np0005546954 nova_compute[187160]: 2025-12-05 12:44:40.630 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk.config /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.121 187164 DEBUG oslo_concurrency.processutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk.config /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.122 187164 DEBUG nova.virt.libvirt.driver [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.123 187164 DEBUG nova.virt.libvirt.vif [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:43:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-364941227',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-364941227',id=3,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:43:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-ewuwj2i6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:43:39Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=dde775e5-862e-4f88-b0e9-7d98a681bb3e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "79860104-80d8-4998-9c9d-057e3c980d6e", "address": "fa:16:3e:e8:c4:c3", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap79860104-80", "ovs_interfaceid": "79860104-80d8-4998-9c9d-057e3c980d6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.124 187164 DEBUG nova.network.os_vif_util [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "79860104-80d8-4998-9c9d-057e3c980d6e", "address": "fa:16:3e:e8:c4:c3", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap79860104-80", "ovs_interfaceid": "79860104-80d8-4998-9c9d-057e3c980d6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.125 187164 DEBUG nova.network.os_vif_util [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:c4:c3,bridge_name='br-int',has_traffic_filtering=True,id=79860104-80d8-4998-9c9d-057e3c980d6e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79860104-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.125 187164 DEBUG os_vif [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:c4:c3,bridge_name='br-int',has_traffic_filtering=True,id=79860104-80d8-4998-9c9d-057e3c980d6e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79860104-80') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.126 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.126 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.127 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.131 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.132 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79860104-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.132 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap79860104-80, col_values=(('external_ids', {'iface-id': '79860104-80d8-4998-9c9d-057e3c980d6e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:c4:c3', 'vm-uuid': 'dde775e5-862e-4f88-b0e9-7d98a681bb3e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.134 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:41 np0005546954 NetworkManager[55665]: <info>  [1764938681.1360] manager: (tap79860104-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.138 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.145 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.146 187164 INFO os_vif [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:c4:c3,bridge_name='br-int',has_traffic_filtering=True,id=79860104-80d8-4998-9c9d-057e3c980d6e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79860104-80')#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.146 187164 DEBUG nova.virt.libvirt.driver [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  5 07:44:41 np0005546954 nova_compute[187160]: 2025-12-05 12:44:41.147 187164 DEBUG nova.compute.manager [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy21d6ygo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dde775e5-862e-4f88-b0e9-7d98a681bb3e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  5 07:44:41 np0005546954 podman[209395]: 2025-12-05 12:44:41.629054271 +0000 UTC m=+0.110105233 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:44:42 np0005546954 nova_compute[187160]: 2025-12-05 12:44:42.405 187164 DEBUG nova.network.neutron [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Port 79860104-80d8-4998-9c9d-057e3c980d6e updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  5 07:44:42 np0005546954 nova_compute[187160]: 2025-12-05 12:44:42.406 187164 DEBUG nova.compute.manager [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy21d6ygo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dde775e5-862e-4f88-b0e9-7d98a681bb3e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  5 07:44:42 np0005546954 systemd[1]: Starting libvirt proxy daemon...
Dec  5 07:44:42 np0005546954 systemd[1]: Started libvirt proxy daemon.
Dec  5 07:44:42 np0005546954 kernel: tap79860104-80: entered promiscuous mode
Dec  5 07:44:42 np0005546954 NetworkManager[55665]: <info>  [1764938682.7163] manager: (tap79860104-80): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Dec  5 07:44:42 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:42Z|00041|binding|INFO|Claiming lport 79860104-80d8-4998-9c9d-057e3c980d6e for this additional chassis.
Dec  5 07:44:42 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:42Z|00042|binding|INFO|79860104-80d8-4998-9c9d-057e3c980d6e: Claiming fa:16:3e:e8:c4:c3 10.100.0.5
Dec  5 07:44:42 np0005546954 nova_compute[187160]: 2025-12-05 12:44:42.716 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:42 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:42Z|00043|binding|INFO|Setting lport 79860104-80d8-4998-9c9d-057e3c980d6e ovn-installed in OVS
Dec  5 07:44:42 np0005546954 nova_compute[187160]: 2025-12-05 12:44:42.744 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:42 np0005546954 nova_compute[187160]: 2025-12-05 12:44:42.749 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:42 np0005546954 systemd-udevd[209474]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:44:42 np0005546954 NetworkManager[55665]: <info>  [1764938682.7797] device (tap79860104-80): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:44:42 np0005546954 NetworkManager[55665]: <info>  [1764938682.7803] device (tap79860104-80): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:44:42 np0005546954 systemd-machined[153497]: New machine qemu-4-instance-00000003.
Dec  5 07:44:42 np0005546954 systemd[1]: Started Virtual Machine qemu-4-instance-00000003.
Dec  5 07:44:42 np0005546954 podman[209441]: 2025-12-05 12:44:42.849428186 +0000 UTC m=+0.176507768 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec  5 07:44:43 np0005546954 nova_compute[187160]: 2025-12-05 12:44:43.384 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938683.3839302, dde775e5-862e-4f88-b0e9-7d98a681bb3e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:44:43 np0005546954 nova_compute[187160]: 2025-12-05 12:44:43.385 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] VM Started (Lifecycle Event)#033[00m
Dec  5 07:44:43 np0005546954 systemd-logind[789]: New session 31 of user nova.
Dec  5 07:44:43 np0005546954 systemd[1]: Started Session 31 of User nova.
Dec  5 07:44:43 np0005546954 nova_compute[187160]: 2025-12-05 12:44:43.871 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:44:44 np0005546954 nova_compute[187160]: 2025-12-05 12:44:44.004 187164 DEBUG nova.compute.manager [req-30b4ce97-7fab-4bb1-b12e-e5d4469a666d req-2792a36d-5267-43e1-99ae-bd90590b705e 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received event network-vif-unplugged-655d2088-f140-4abd-a02f-295b2208fecf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:44:44 np0005546954 nova_compute[187160]: 2025-12-05 12:44:44.005 187164 DEBUG oslo_concurrency.lockutils [req-30b4ce97-7fab-4bb1-b12e-e5d4469a666d req-2792a36d-5267-43e1-99ae-bd90590b705e 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:44 np0005546954 nova_compute[187160]: 2025-12-05 12:44:44.005 187164 DEBUG oslo_concurrency.lockutils [req-30b4ce97-7fab-4bb1-b12e-e5d4469a666d req-2792a36d-5267-43e1-99ae-bd90590b705e 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:44 np0005546954 nova_compute[187160]: 2025-12-05 12:44:44.006 187164 DEBUG oslo_concurrency.lockutils [req-30b4ce97-7fab-4bb1-b12e-e5d4469a666d req-2792a36d-5267-43e1-99ae-bd90590b705e 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:44 np0005546954 nova_compute[187160]: 2025-12-05 12:44:44.006 187164 DEBUG nova.compute.manager [req-30b4ce97-7fab-4bb1-b12e-e5d4469a666d req-2792a36d-5267-43e1-99ae-bd90590b705e 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] No waiting events found dispatching network-vif-unplugged-655d2088-f140-4abd-a02f-295b2208fecf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:44:44 np0005546954 nova_compute[187160]: 2025-12-05 12:44:44.006 187164 WARNING nova.compute.manager [req-30b4ce97-7fab-4bb1-b12e-e5d4469a666d req-2792a36d-5267-43e1-99ae-bd90590b705e 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received unexpected event network-vif-unplugged-655d2088-f140-4abd-a02f-295b2208fecf for instance with vm_state active and task_state resize_migrating.#033[00m
Dec  5 07:44:44 np0005546954 nova_compute[187160]: 2025-12-05 12:44:44.048 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:44 np0005546954 systemd[1]: session-31.scope: Deactivated successfully.
Dec  5 07:44:44 np0005546954 systemd-logind[789]: Session 31 logged out. Waiting for processes to exit.
Dec  5 07:44:44 np0005546954 systemd-logind[789]: Removed session 31.
Dec  5 07:44:44 np0005546954 systemd-logind[789]: New session 32 of user nova.
Dec  5 07:44:44 np0005546954 systemd[1]: Started Session 32 of User nova.
Dec  5 07:44:44 np0005546954 nova_compute[187160]: 2025-12-05 12:44:44.568 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938684.567557, dde775e5-862e-4f88-b0e9-7d98a681bb3e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:44:44 np0005546954 nova_compute[187160]: 2025-12-05 12:44:44.568 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:44:44 np0005546954 nova_compute[187160]: 2025-12-05 12:44:44.585 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:44:44 np0005546954 nova_compute[187160]: 2025-12-05 12:44:44.590 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:44:44 np0005546954 systemd[1]: session-32.scope: Deactivated successfully.
Dec  5 07:44:44 np0005546954 systemd-logind[789]: Session 32 logged out. Waiting for processes to exit.
Dec  5 07:44:44 np0005546954 nova_compute[187160]: 2025-12-05 12:44:44.611 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  5 07:44:44 np0005546954 systemd-logind[789]: Removed session 32.
Dec  5 07:44:44 np0005546954 systemd-logind[789]: New session 33 of user nova.
Dec  5 07:44:44 np0005546954 systemd[1]: Started Session 33 of User nova.
Dec  5 07:44:44 np0005546954 systemd[1]: session-33.scope: Deactivated successfully.
Dec  5 07:44:44 np0005546954 systemd-logind[789]: Session 33 logged out. Waiting for processes to exit.
Dec  5 07:44:44 np0005546954 systemd-logind[789]: Removed session 33.
Dec  5 07:44:46 np0005546954 nova_compute[187160]: 2025-12-05 12:44:46.137 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:46 np0005546954 nova_compute[187160]: 2025-12-05 12:44:46.292 187164 DEBUG nova.compute.manager [req-82017575-58d4-4636-9c73-46a54a75ae84 req-fc27ff71-c583-4bfb-8fa2-d4df6a5d9b41 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received event network-vif-plugged-655d2088-f140-4abd-a02f-295b2208fecf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:44:46 np0005546954 nova_compute[187160]: 2025-12-05 12:44:46.293 187164 DEBUG oslo_concurrency.lockutils [req-82017575-58d4-4636-9c73-46a54a75ae84 req-fc27ff71-c583-4bfb-8fa2-d4df6a5d9b41 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:46 np0005546954 nova_compute[187160]: 2025-12-05 12:44:46.293 187164 DEBUG oslo_concurrency.lockutils [req-82017575-58d4-4636-9c73-46a54a75ae84 req-fc27ff71-c583-4bfb-8fa2-d4df6a5d9b41 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:46 np0005546954 nova_compute[187160]: 2025-12-05 12:44:46.293 187164 DEBUG oslo_concurrency.lockutils [req-82017575-58d4-4636-9c73-46a54a75ae84 req-fc27ff71-c583-4bfb-8fa2-d4df6a5d9b41 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:46 np0005546954 nova_compute[187160]: 2025-12-05 12:44:46.294 187164 DEBUG nova.compute.manager [req-82017575-58d4-4636-9c73-46a54a75ae84 req-fc27ff71-c583-4bfb-8fa2-d4df6a5d9b41 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] No waiting events found dispatching network-vif-plugged-655d2088-f140-4abd-a02f-295b2208fecf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:44:46 np0005546954 nova_compute[187160]: 2025-12-05 12:44:46.294 187164 WARNING nova.compute.manager [req-82017575-58d4-4636-9c73-46a54a75ae84 req-fc27ff71-c583-4bfb-8fa2-d4df6a5d9b41 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received unexpected event network-vif-plugged-655d2088-f140-4abd-a02f-295b2208fecf for instance with vm_state active and task_state resize_migrating.#033[00m
Dec  5 07:44:47 np0005546954 nova_compute[187160]: 2025-12-05 12:44:47.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:44:47 np0005546954 nova_compute[187160]: 2025-12-05 12:44:47.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 07:44:47 np0005546954 nova_compute[187160]: 2025-12-05 12:44:47.057 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 07:44:47 np0005546954 nova_compute[187160]: 2025-12-05 12:44:47.370 187164 INFO nova.network.neutron [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Updating port 655d2088-f140-4abd-a02f-295b2208fecf with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Dec  5 07:44:47 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:47Z|00044|binding|INFO|Claiming lport 79860104-80d8-4998-9c9d-057e3c980d6e for this chassis.
Dec  5 07:44:47 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:47Z|00045|binding|INFO|79860104-80d8-4998-9c9d-057e3c980d6e: Claiming fa:16:3e:e8:c4:c3 10.100.0.5
Dec  5 07:44:47 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:47Z|00046|binding|INFO|Setting lport 79860104-80d8-4998-9c9d-057e3c980d6e up in Southbound
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.471 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:c4:c3 10.100.0.5'], port_security=['fa:16:3e:e8:c4:c3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dde775e5-862e-4f88-b0e9-7d98a681bb3e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee43e901-b158-4dc0-894f-2384aef8b277', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'a865bea6-413e-4ecb-bace-2ec9005935f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c84cb49-52df-48a9-8d24-aff5b642e12a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=79860104-80d8-4998-9c9d-057e3c980d6e) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.476 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 79860104-80d8-4998-9c9d-057e3c980d6e in datapath ee43e901-b158-4dc0-894f-2384aef8b277 bound to our chassis#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.481 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee43e901-b158-4dc0-894f-2384aef8b277#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.514 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[8a67e5d6-8b8b-43e5-9c9a-10c95cd14699]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.563 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[a3708f9c-633c-486b-8e73-c326ca386cbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.569 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[1e5a868e-5a15-48e5-9ce1-660786b45ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.615 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[0889afa8-2ff1-41b9-9131-7268a072c09f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:47 np0005546954 nova_compute[187160]: 2025-12-05 12:44:47.627 187164 INFO nova.compute.manager [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Post operation of migration started#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.639 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[98bbb1ae-8468-4d81-b1d2-a2085b4f04c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee43e901-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:af:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 17, 'tx_packets': 9, 'rx_bytes': 994, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 17, 'tx_packets': 9, 'rx_bytes': 994, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358945, 'reachable_time': 17578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209530, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.667 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b38c38e2-f22e-436a-9575-e539df02bdd1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358963, 'tstamp': 358963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209531, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358968, 'tstamp': 358968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209531, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.669 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee43e901-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:47 np0005546954 nova_compute[187160]: 2025-12-05 12:44:47.725 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:47 np0005546954 nova_compute[187160]: 2025-12-05 12:44:47.727 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.728 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee43e901-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.728 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.729 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee43e901-b0, col_values=(('external_ids', {'iface-id': 'ff42a43f-b4ac-4be3-b747-f3c0a6e67328'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:47.730 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:44:47 np0005546954 nova_compute[187160]: 2025-12-05 12:44:47.945 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-dde775e5-862e-4f88-b0e9-7d98a681bb3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:44:47 np0005546954 nova_compute[187160]: 2025-12-05 12:44:47.946 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-dde775e5-862e-4f88-b0e9-7d98a681bb3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:44:47 np0005546954 nova_compute[187160]: 2025-12-05 12:44:47.947 187164 DEBUG nova.network.neutron [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:44:48 np0005546954 nova_compute[187160]: 2025-12-05 12:44:48.213 187164 DEBUG oslo_concurrency.lockutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-bd45ae9f-9649-4347-a5e4-658d02804ef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:44:48 np0005546954 nova_compute[187160]: 2025-12-05 12:44:48.214 187164 DEBUG oslo_concurrency.lockutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-bd45ae9f-9649-4347-a5e4-658d02804ef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:44:48 np0005546954 nova_compute[187160]: 2025-12-05 12:44:48.214 187164 DEBUG nova.network.neutron [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:44:49 np0005546954 nova_compute[187160]: 2025-12-05 12:44:49.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:44:49 np0005546954 nova_compute[187160]: 2025-12-05 12:44:49.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 07:44:49 np0005546954 nova_compute[187160]: 2025-12-05 12:44:49.051 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:49 np0005546954 nova_compute[187160]: 2025-12-05 12:44:49.317 187164 DEBUG nova.network.neutron [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Updating instance_info_cache with network_info: [{"id": "79860104-80d8-4998-9c9d-057e3c980d6e", "address": "fa:16:3e:e8:c4:c3", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79860104-80", "ovs_interfaceid": "79860104-80d8-4998-9c9d-057e3c980d6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:44:49 np0005546954 nova_compute[187160]: 2025-12-05 12:44:49.343 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-dde775e5-862e-4f88-b0e9-7d98a681bb3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:44:49 np0005546954 nova_compute[187160]: 2025-12-05 12:44:49.362 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:49 np0005546954 nova_compute[187160]: 2025-12-05 12:44:49.363 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:49 np0005546954 nova_compute[187160]: 2025-12-05 12:44:49.363 187164 DEBUG oslo_concurrency.lockutils [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:49 np0005546954 nova_compute[187160]: 2025-12-05 12:44:49.367 187164 INFO nova.virt.libvirt.driver [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  5 07:44:49 np0005546954 virtqemud[186730]: Domain id=4 name='instance-00000003' uuid=dde775e5-862e-4f88-b0e9-7d98a681bb3e is tainted: custom-monitor
Dec  5 07:44:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:44:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:44:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:44:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:44:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:44:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:44:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:44:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:44:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:44:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:44:50 np0005546954 nova_compute[187160]: 2025-12-05 12:44:50.149 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:44:50 np0005546954 nova_compute[187160]: 2025-12-05 12:44:50.377 187164 INFO nova.virt.libvirt.driver [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  5 07:44:50 np0005546954 nova_compute[187160]: 2025-12-05 12:44:50.991 187164 DEBUG nova.network.neutron [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Updating instance_info_cache with network_info: [{"id": "655d2088-f140-4abd-a02f-295b2208fecf", "address": "fa:16:3e:5e:21:c0", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap655d2088-f1", "ovs_interfaceid": "655d2088-f140-4abd-a02f-295b2208fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.041 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.041 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.140 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.308 187164 DEBUG oslo_concurrency.lockutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-bd45ae9f-9649-4347-a5e4-658d02804ef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.319 187164 DEBUG nova.compute.manager [req-9037f59d-04d1-4ee6-8938-f7b17b3047d0 req-612ee2fe-07fc-44a4-846d-b4cf069b0dd3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received event network-changed-655d2088-f140-4abd-a02f-295b2208fecf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.320 187164 DEBUG nova.compute.manager [req-9037f59d-04d1-4ee6-8938-f7b17b3047d0 req-612ee2fe-07fc-44a4-846d-b4cf069b0dd3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Refreshing instance network info cache due to event network-changed-655d2088-f140-4abd-a02f-295b2208fecf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.321 187164 DEBUG oslo_concurrency.lockutils [req-9037f59d-04d1-4ee6-8938-f7b17b3047d0 req-612ee2fe-07fc-44a4-846d-b4cf069b0dd3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-bd45ae9f-9649-4347-a5e4-658d02804ef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.321 187164 DEBUG oslo_concurrency.lockutils [req-9037f59d-04d1-4ee6-8938-f7b17b3047d0 req-612ee2fe-07fc-44a4-846d-b4cf069b0dd3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-bd45ae9f-9649-4347-a5e4-658d02804ef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.322 187164 DEBUG nova.network.neutron [req-9037f59d-04d1-4ee6-8938-f7b17b3047d0 req-612ee2fe-07fc-44a4-846d-b4cf069b0dd3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Refreshing network info cache for port 655d2088-f140-4abd-a02f-295b2208fecf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.385 187164 INFO nova.virt.libvirt.driver [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.392 187164 DEBUG nova.compute.manager [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.434 187164 DEBUG nova.objects.instance [None req-cd3e6ee5-902f-475d-b2bb-ff9e81cbd989 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.504 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.505 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.505 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.505 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.523 187164 DEBUG nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.526 187164 DEBUG nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.526 187164 INFO nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Creating image(s)#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.528 187164 DEBUG nova.objects.instance [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'trusted_certs' on Instance uuid bd45ae9f-9649-4347-a5e4-658d02804ef9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.556 187164 DEBUG oslo_concurrency.processutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.651 187164 DEBUG oslo_concurrency.processutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.652 187164 DEBUG nova.virt.disk.api [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Checking if we can resize image /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.653 187164 DEBUG oslo_concurrency.processutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.707 187164 DEBUG oslo_concurrency.processutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.708 187164 DEBUG nova.virt.disk.api [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Cannot resize image /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.729 187164 DEBUG nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.730 187164 DEBUG nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Ensure instance console log exists: /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.731 187164 DEBUG oslo_concurrency.lockutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.732 187164 DEBUG oslo_concurrency.lockutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.733 187164 DEBUG oslo_concurrency.lockutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.738 187164 DEBUG nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Start _get_guest_xml network_info=[{"id": "655d2088-f140-4abd-a02f-295b2208fecf", "address": "fa:16:3e:5e:21:c0", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "vif_mac": "fa:16:3e:5e:21:c0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap655d2088-f1", "ovs_interfaceid": "655d2088-f140-4abd-a02f-295b2208fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.745 187164 WARNING nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.754 187164 DEBUG nova.virt.libvirt.host [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.755 187164 DEBUG nova.virt.libvirt.host [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.759 187164 DEBUG nova.virt.libvirt.host [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.759 187164 DEBUG nova.virt.libvirt.host [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.761 187164 DEBUG nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.761 187164 DEBUG nova.virt.hardware [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.762 187164 DEBUG nova.virt.hardware [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.762 187164 DEBUG nova.virt.hardware [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.762 187164 DEBUG nova.virt.hardware [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.762 187164 DEBUG nova.virt.hardware [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.762 187164 DEBUG nova.virt.hardware [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.763 187164 DEBUG nova.virt.hardware [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.763 187164 DEBUG nova.virt.hardware [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.763 187164 DEBUG nova.virt.hardware [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.763 187164 DEBUG nova.virt.hardware [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.764 187164 DEBUG nova.virt.hardware [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.764 187164 DEBUG nova.objects.instance [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'vcpu_model' on Instance uuid bd45ae9f-9649-4347-a5e4-658d02804ef9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.927 187164 DEBUG oslo_concurrency.processutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.996 187164 DEBUG oslo_concurrency.processutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk.config --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.998 187164 DEBUG oslo_concurrency.lockutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "/var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:51 np0005546954 nova_compute[187160]: 2025-12-05 12:44:51.999 187164 DEBUG oslo_concurrency.lockutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "/var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.001 187164 DEBUG oslo_concurrency.lockutils [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "/var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.003 187164 DEBUG nova.virt.libvirt.vif [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:44:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-238469778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-238469778',id=5,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:44:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-8yye3j0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:44:46Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=bd45ae9f-9649-4347-a5e4-658d02804ef9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "655d2088-f140-4abd-a02f-295b2208fecf", "address": "fa:16:3e:5e:21:c0", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "vif_mac": "fa:16:3e:5e:21:c0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap655d2088-f1", "ovs_interfaceid": "655d2088-f140-4abd-a02f-295b2208fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.004 187164 DEBUG nova.network.os_vif_util [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "655d2088-f140-4abd-a02f-295b2208fecf", "address": "fa:16:3e:5e:21:c0", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "vif_mac": "fa:16:3e:5e:21:c0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap655d2088-f1", "ovs_interfaceid": "655d2088-f140-4abd-a02f-295b2208fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.005 187164 DEBUG nova.network.os_vif_util [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:21:c0,bridge_name='br-int',has_traffic_filtering=True,id=655d2088-f140-4abd-a02f-295b2208fecf,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap655d2088-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.011 187164 DEBUG nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  <uuid>bd45ae9f-9649-4347-a5e4-658d02804ef9</uuid>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  <name>instance-00000005</name>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteActionsViaActuator-server-238469778</nova:name>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 12:44:51</nova:creationTime>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:        <nova:user uuid="7ce7ef64754e4a32b4af3272e31a4a5e">tempest-TestExecuteActionsViaActuator-1570363089-project-member</nova:user>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:        <nova:project uuid="6b5f383ed0484ca1bde081bf623dad4b">tempest-TestExecuteActionsViaActuator-1570363089</nova:project>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:        <nova:port uuid="655d2088-f140-4abd-a02f-295b2208fecf">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <system>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <entry name="serial">bd45ae9f-9649-4347-a5e4-658d02804ef9</entry>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <entry name="uuid">bd45ae9f-9649-4347-a5e4-658d02804ef9</entry>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    </system>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  <os>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  </clock>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk.config"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:5e:21:c0"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <target dev="tap655d2088-f1"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/console.log" append="off"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    </serial>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <video>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 07:44:52 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 07:44:52 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:44:52 np0005546954 nova_compute[187160]: </domain>
Dec  5 07:44:52 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.013 187164 DEBUG nova.virt.libvirt.vif [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:44:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-238469778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-238469778',id=5,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:44:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-8yye3j0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:44:46Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=bd45ae9f-9649-4347-a5e4-658d02804ef9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "655d2088-f140-4abd-a02f-295b2208fecf", "address": "fa:16:3e:5e:21:c0", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "vif_mac": "fa:16:3e:5e:21:c0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap655d2088-f1", "ovs_interfaceid": "655d2088-f140-4abd-a02f-295b2208fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.013 187164 DEBUG nova.network.os_vif_util [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "655d2088-f140-4abd-a02f-295b2208fecf", "address": "fa:16:3e:5e:21:c0", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "vif_mac": "fa:16:3e:5e:21:c0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap655d2088-f1", "ovs_interfaceid": "655d2088-f140-4abd-a02f-295b2208fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.014 187164 DEBUG nova.network.os_vif_util [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:21:c0,bridge_name='br-int',has_traffic_filtering=True,id=655d2088-f140-4abd-a02f-295b2208fecf,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap655d2088-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.015 187164 DEBUG os_vif [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:21:c0,bridge_name='br-int',has_traffic_filtering=True,id=655d2088-f140-4abd-a02f-295b2208fecf,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap655d2088-f1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.017 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.017 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.018 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.023 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.023 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap655d2088-f1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.025 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap655d2088-f1, col_values=(('external_ids', {'iface-id': '655d2088-f140-4abd-a02f-295b2208fecf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:21:c0', 'vm-uuid': 'bd45ae9f-9649-4347-a5e4-658d02804ef9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.027 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:52 np0005546954 NetworkManager[55665]: <info>  [1764938692.0281] manager: (tap655d2088-f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.031 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.037 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.038 187164 INFO os_vif [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:21:c0,bridge_name='br-int',has_traffic_filtering=True,id=655d2088-f140-4abd-a02f-295b2208fecf,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap655d2088-f1')#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.116 187164 DEBUG nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.116 187164 DEBUG nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.116 187164 DEBUG nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No VIF found with MAC fa:16:3e:5e:21:c0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.117 187164 INFO nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Using config drive#033[00m
Dec  5 07:44:52 np0005546954 kernel: tap655d2088-f1: entered promiscuous mode
Dec  5 07:44:52 np0005546954 NetworkManager[55665]: <info>  [1764938692.1927] manager: (tap655d2088-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Dec  5 07:44:52 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:52Z|00047|binding|INFO|Claiming lport 655d2088-f140-4abd-a02f-295b2208fecf for this chassis.
Dec  5 07:44:52 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:52Z|00048|binding|INFO|655d2088-f140-4abd-a02f-295b2208fecf: Claiming fa:16:3e:5e:21:c0 10.100.0.10
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.194 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.208 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:21:c0 10.100.0.10'], port_security=['fa:16:3e:5e:21:c0 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bd45ae9f-9649-4347-a5e4-658d02804ef9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee43e901-b158-4dc0-894f-2384aef8b277', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'a865bea6-413e-4ecb-bace-2ec9005935f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c84cb49-52df-48a9-8d24-aff5b642e12a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=655d2088-f140-4abd-a02f-295b2208fecf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.211 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 655d2088-f140-4abd-a02f-295b2208fecf in datapath ee43e901-b158-4dc0-894f-2384aef8b277 bound to our chassis#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.214 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee43e901-b158-4dc0-894f-2384aef8b277#033[00m
Dec  5 07:44:52 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:52Z|00049|binding|INFO|Setting lport 655d2088-f140-4abd-a02f-295b2208fecf ovn-installed in OVS
Dec  5 07:44:52 np0005546954 ovn_controller[95566]: 2025-12-05T12:44:52Z|00050|binding|INFO|Setting lport 655d2088-f140-4abd-a02f-295b2208fecf up in Southbound
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.225 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.234 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.234 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5ea2feef-81d6-4b79-88d8-723b219a56dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:52 np0005546954 systemd-udevd[209560]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:44:52 np0005546954 systemd-machined[153497]: New machine qemu-5-instance-00000005.
Dec  5 07:44:52 np0005546954 NetworkManager[55665]: <info>  [1764938692.2639] device (tap655d2088-f1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:44:52 np0005546954 NetworkManager[55665]: <info>  [1764938692.2654] device (tap655d2088-f1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:44:52 np0005546954 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.272 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[c7925d92-df00-4fa8-a75c-0d8a0990f278]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.275 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[3ff87944-95e4-42a5-a3df-5ba9c62b4271]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.318 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[af65e958-214e-4d10-8e2b-47a80ce838c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.339 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ba94f2a7-12ff-446c-9705-73797f0a453a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee43e901-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:af:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 11, 'rx_bytes': 1624, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 11, 'rx_bytes': 1624, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358945, 'reachable_time': 17578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209571, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.364 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[26b7a41c-d0f2-4dfa-93d0-96aab3b47727]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358963, 'tstamp': 358963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209572, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358968, 'tstamp': 358968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209572, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.366 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee43e901-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.368 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.370 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.370 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee43e901-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.371 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.372 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee43e901-b0, col_values=(('external_ids', {'iface-id': 'ff42a43f-b4ac-4be3-b747-f3c0a6e67328'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:44:52 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:44:52.372 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.622 187164 DEBUG nova.compute.manager [req-87d6f8c1-bbad-4604-a160-21c166f58280 req-fceb513e-55f9-4dc7-b0e6-b26a732dfe2d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received event network-vif-plugged-655d2088-f140-4abd-a02f-295b2208fecf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.623 187164 DEBUG oslo_concurrency.lockutils [req-87d6f8c1-bbad-4604-a160-21c166f58280 req-fceb513e-55f9-4dc7-b0e6-b26a732dfe2d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.623 187164 DEBUG oslo_concurrency.lockutils [req-87d6f8c1-bbad-4604-a160-21c166f58280 req-fceb513e-55f9-4dc7-b0e6-b26a732dfe2d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.623 187164 DEBUG oslo_concurrency.lockutils [req-87d6f8c1-bbad-4604-a160-21c166f58280 req-fceb513e-55f9-4dc7-b0e6-b26a732dfe2d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.623 187164 DEBUG nova.compute.manager [req-87d6f8c1-bbad-4604-a160-21c166f58280 req-fceb513e-55f9-4dc7-b0e6-b26a732dfe2d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] No waiting events found dispatching network-vif-plugged-655d2088-f140-4abd-a02f-295b2208fecf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.624 187164 WARNING nova.compute.manager [req-87d6f8c1-bbad-4604-a160-21c166f58280 req-fceb513e-55f9-4dc7-b0e6-b26a732dfe2d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received unexpected event network-vif-plugged-655d2088-f140-4abd-a02f-295b2208fecf for instance with vm_state active and task_state resize_finish.#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.863 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938692.862584, bd45ae9f-9649-4347-a5e4-658d02804ef9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.864 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.868 187164 DEBUG nova.compute.manager [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.875 187164 INFO nova.virt.libvirt.driver [-] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Instance running successfully.#033[00m
Dec  5 07:44:52 np0005546954 virtqemud[186730]: argument unsupported: QEMU guest agent is not configured
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.882 187164 DEBUG nova.virt.libvirt.guest [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.883 187164 DEBUG nova.virt.libvirt.driver [None req-30410424-0a3d-4ea9-bcc3-60006669dedb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.955 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.960 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.996 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.997 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938692.8655608, bd45ae9f-9649-4347-a5e4-658d02804ef9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:44:52 np0005546954 nova_compute[187160]: 2025-12-05 12:44:52.997 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] VM Started (Lifecycle Event)#033[00m
Dec  5 07:44:53 np0005546954 nova_compute[187160]: 2025-12-05 12:44:53.032 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:44:53 np0005546954 nova_compute[187160]: 2025-12-05 12:44:53.037 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:44:53 np0005546954 nova_compute[187160]: 2025-12-05 12:44:53.364 187164 DEBUG nova.network.neutron [req-9037f59d-04d1-4ee6-8938-f7b17b3047d0 req-612ee2fe-07fc-44a4-846d-b4cf069b0dd3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Updated VIF entry in instance network info cache for port 655d2088-f140-4abd-a02f-295b2208fecf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:44:53 np0005546954 nova_compute[187160]: 2025-12-05 12:44:53.365 187164 DEBUG nova.network.neutron [req-9037f59d-04d1-4ee6-8938-f7b17b3047d0 req-612ee2fe-07fc-44a4-846d-b4cf069b0dd3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Updating instance_info_cache with network_info: [{"id": "655d2088-f140-4abd-a02f-295b2208fecf", "address": "fa:16:3e:5e:21:c0", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap655d2088-f1", "ovs_interfaceid": "655d2088-f140-4abd-a02f-295b2208fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:44:53 np0005546954 nova_compute[187160]: 2025-12-05 12:44:53.393 187164 DEBUG oslo_concurrency.lockutils [req-9037f59d-04d1-4ee6-8938-f7b17b3047d0 req-612ee2fe-07fc-44a4-846d-b4cf069b0dd3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-bd45ae9f-9649-4347-a5e4-658d02804ef9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:44:53 np0005546954 nova_compute[187160]: 2025-12-05 12:44:53.450 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Updating instance_info_cache with network_info: [{"id": "f9c0965a-861e-4c24-9c97-679c6d706267", "address": "fa:16:3e:99:40:2c", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9c0965a-86", "ovs_interfaceid": "f9c0965a-861e-4c24-9c97-679c6d706267", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:44:53 np0005546954 nova_compute[187160]: 2025-12-05 12:44:53.466 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:44:53 np0005546954 nova_compute[187160]: 2025-12-05 12:44:53.466 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:44:53 np0005546954 nova_compute[187160]: 2025-12-05 12:44:53.466 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:44:53 np0005546954 nova_compute[187160]: 2025-12-05 12:44:53.467 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:44:54 np0005546954 nova_compute[187160]: 2025-12-05 12:44:54.050 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:44:54 np0005546954 nova_compute[187160]: 2025-12-05 12:44:54.055 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:54 np0005546954 nova_compute[187160]: 2025-12-05 12:44:54.079 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:44:54 np0005546954 podman[209585]: 2025-12-05 12:44:54.59261428 +0000 UTC m=+0.088146646 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, config_id=edpm, name=ubi9-minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7)
Dec  5 07:44:54 np0005546954 podman[209586]: 2025-12-05 12:44:54.613032438 +0000 UTC m=+0.114283043 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec  5 07:44:54 np0005546954 nova_compute[187160]: 2025-12-05 12:44:54.736 187164 DEBUG nova.compute.manager [req-d8f9a8d1-c0cc-4090-bcaf-d161f484bc16 req-4a64c27c-2063-4834-af3c-a4e72bcf9d7f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received event network-vif-plugged-655d2088-f140-4abd-a02f-295b2208fecf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:44:54 np0005546954 nova_compute[187160]: 2025-12-05 12:44:54.737 187164 DEBUG oslo_concurrency.lockutils [req-d8f9a8d1-c0cc-4090-bcaf-d161f484bc16 req-4a64c27c-2063-4834-af3c-a4e72bcf9d7f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:54 np0005546954 nova_compute[187160]: 2025-12-05 12:44:54.737 187164 DEBUG oslo_concurrency.lockutils [req-d8f9a8d1-c0cc-4090-bcaf-d161f484bc16 req-4a64c27c-2063-4834-af3c-a4e72bcf9d7f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:54 np0005546954 nova_compute[187160]: 2025-12-05 12:44:54.737 187164 DEBUG oslo_concurrency.lockutils [req-d8f9a8d1-c0cc-4090-bcaf-d161f484bc16 req-4a64c27c-2063-4834-af3c-a4e72bcf9d7f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:54 np0005546954 nova_compute[187160]: 2025-12-05 12:44:54.738 187164 DEBUG nova.compute.manager [req-d8f9a8d1-c0cc-4090-bcaf-d161f484bc16 req-4a64c27c-2063-4834-af3c-a4e72bcf9d7f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] No waiting events found dispatching network-vif-plugged-655d2088-f140-4abd-a02f-295b2208fecf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:44:54 np0005546954 nova_compute[187160]: 2025-12-05 12:44:54.738 187164 WARNING nova.compute.manager [req-d8f9a8d1-c0cc-4090-bcaf-d161f484bc16 req-4a64c27c-2063-4834-af3c-a4e72bcf9d7f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received unexpected event network-vif-plugged-655d2088-f140-4abd-a02f-295b2208fecf for instance with vm_state resized and task_state None.#033[00m
Dec  5 07:44:54 np0005546954 systemd[1]: Stopping User Manager for UID 42436...
Dec  5 07:44:54 np0005546954 systemd[209350]: Activating special unit Exit the Session...
Dec  5 07:44:54 np0005546954 systemd[209350]: Stopped target Main User Target.
Dec  5 07:44:54 np0005546954 systemd[209350]: Stopped target Basic System.
Dec  5 07:44:54 np0005546954 systemd[209350]: Stopped target Paths.
Dec  5 07:44:54 np0005546954 systemd[209350]: Stopped target Sockets.
Dec  5 07:44:54 np0005546954 systemd[209350]: Stopped target Timers.
Dec  5 07:44:54 np0005546954 systemd[209350]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec  5 07:44:54 np0005546954 systemd[209350]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  5 07:44:54 np0005546954 systemd[209350]: Closed D-Bus User Message Bus Socket.
Dec  5 07:44:54 np0005546954 systemd[209350]: Stopped Create User's Volatile Files and Directories.
Dec  5 07:44:54 np0005546954 systemd[209350]: Removed slice User Application Slice.
Dec  5 07:44:54 np0005546954 systemd[209350]: Reached target Shutdown.
Dec  5 07:44:54 np0005546954 systemd[209350]: Finished Exit the Session.
Dec  5 07:44:54 np0005546954 systemd[209350]: Reached target Exit the Session.
Dec  5 07:44:54 np0005546954 systemd[1]: user@42436.service: Deactivated successfully.
Dec  5 07:44:54 np0005546954 systemd[1]: Stopped User Manager for UID 42436.
Dec  5 07:44:54 np0005546954 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec  5 07:44:55 np0005546954 systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec  5 07:44:55 np0005546954 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec  5 07:44:55 np0005546954 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec  5 07:44:55 np0005546954 systemd[1]: Removed slice User Slice of UID 42436.
Dec  5 07:44:56 np0005546954 nova_compute[187160]: 2025-12-05 12:44:56.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:44:56 np0005546954 nova_compute[187160]: 2025-12-05 12:44:56.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.066 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.127 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.127 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.128 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.128 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.249 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.303 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.303 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.354 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.359 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.411 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.412 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.468 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.475 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.550 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.552 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.605 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.614 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.670 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.672 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.732 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.739 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.795 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.796 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:44:57 np0005546954 nova_compute[187160]: 2025-12-05 12:44:57.848 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.031 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.033 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5099MB free_disk=73.19702529907227GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.034 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.034 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.660 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Applying migration context for instance bd45ae9f-9649-4347-a5e4-658d02804ef9 as it has an incoming, in-progress migration a45b1cf8-9c77-4611-bf5f-26795e8b6e03. Migration status is confirming _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.677 187164 INFO nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Updating resource usage from migration a45b1cf8-9c77-4611-bf5f-26795e8b6e03#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.716 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 0513a02c-7fe2-43aa-9bd6-020014460672 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.717 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.717 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance a133dad5-02c4-4021-90e5-ee9f3322f351 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.718 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance bd45ae9f-9649-4347-a5e4-658d02804ef9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.718 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance dde775e5-862e-4f88-b0e9-7d98a681bb3e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.718 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.719 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=1152MB phys_disk=79GB used_disk=5GB total_vcpus=8 used_vcpus=5 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:44:58 np0005546954 nova_compute[187160]: 2025-12-05 12:44:58.904 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:44:59 np0005546954 nova_compute[187160]: 2025-12-05 12:44:59.057 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:44:59 np0005546954 nova_compute[187160]: 2025-12-05 12:44:59.069 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:44:59 np0005546954 nova_compute[187160]: 2025-12-05 12:44:59.097 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:44:59 np0005546954 nova_compute[187160]: 2025-12-05 12:44:59.097 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:02 np0005546954 nova_compute[187160]: 2025-12-05 12:45:02.068 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:03 np0005546954 nova_compute[187160]: 2025-12-05 12:45:03.098 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.060 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.283 187164 DEBUG oslo_concurrency.lockutils [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "a133dad5-02c4-4021-90e5-ee9f3322f351" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.284 187164 DEBUG oslo_concurrency.lockutils [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.284 187164 DEBUG oslo_concurrency.lockutils [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.286 187164 DEBUG oslo_concurrency.lockutils [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.286 187164 DEBUG oslo_concurrency.lockutils [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.289 187164 INFO nova.compute.manager [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Terminating instance#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.291 187164 DEBUG nova.compute.manager [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:45:04 np0005546954 kernel: tap1f839a4f-9b (unregistering): left promiscuous mode
Dec  5 07:45:04 np0005546954 NetworkManager[55665]: <info>  [1764938704.3268] device (tap1f839a4f-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:45:04 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:04Z|00051|binding|INFO|Releasing lport 1f839a4f-9bf4-4b39-aca7-83959475d57e from this chassis (sb_readonly=0)
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.341 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:04 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:04Z|00052|binding|INFO|Setting lport 1f839a4f-9bf4-4b39-aca7-83959475d57e down in Southbound
Dec  5 07:45:04 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:04Z|00053|binding|INFO|Removing iface tap1f839a4f-9b ovn-installed in OVS
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.348 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.358 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:0e:16 10.100.0.7'], port_security=['fa:16:3e:bd:0e:16 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a133dad5-02c4-4021-90e5-ee9f3322f351', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee43e901-b158-4dc0-894f-2384aef8b277', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a865bea6-413e-4ecb-bace-2ec9005935f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c84cb49-52df-48a9-8d24-aff5b642e12a, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=1f839a4f-9bf4-4b39-aca7-83959475d57e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.360 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.361 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 1f839a4f-9bf4-4b39-aca7-83959475d57e in datapath ee43e901-b158-4dc0-894f-2384aef8b277 unbound from our chassis#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.365 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee43e901-b158-4dc0-894f-2384aef8b277#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.383 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[89e55967-5633-4621-9c47-a98ef5d8cb76]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:04 np0005546954 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Deactivated successfully.
Dec  5 07:45:04 np0005546954 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Consumed 14.626s CPU time.
Dec  5 07:45:04 np0005546954 systemd-machined[153497]: Machine qemu-3-instance-00000006 terminated.
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.415 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[072a3198-3206-4a8d-af14-f8106b7d10cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.419 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[0d5f0bde-1d96-4fcc-bb5b-d0dc3d901f86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.453 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[9c5963a7-d347-4576-bc80-c5d77c8007d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.476 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[44b729b5-8381-4673-a010-5187320428cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee43e901-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:af:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 13, 'rx_bytes': 1624, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 13, 'rx_bytes': 1624, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358945, 'reachable_time': 17578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209682, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.500 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[0f0d9675-a90d-48ce-915e-96723504201b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358963, 'tstamp': 358963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209683, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358968, 'tstamp': 358968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209683, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.502 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee43e901-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.504 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.515 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.516 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee43e901-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.517 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.517 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee43e901-b0, col_values=(('external_ids', {'iface-id': 'ff42a43f-b4ac-4be3-b747-f3c0a6e67328'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:04.518 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.567 187164 INFO nova.virt.libvirt.driver [-] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Instance destroyed successfully.#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.568 187164 DEBUG nova.objects.instance [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lazy-loading 'resources' on Instance uuid a133dad5-02c4-4021-90e5-ee9f3322f351 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.591 187164 DEBUG nova.virt.libvirt.vif [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:44:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1937603378',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1937603378',id=6,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:44:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-kl33bqt4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:44:27Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=a133dad5-02c4-4021-90e5-ee9f3322f351,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "address": "fa:16:3e:bd:0e:16", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f839a4f-9b", "ovs_interfaceid": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.592 187164 DEBUG nova.network.os_vif_util [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converting VIF {"id": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "address": "fa:16:3e:bd:0e:16", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f839a4f-9b", "ovs_interfaceid": "1f839a4f-9bf4-4b39-aca7-83959475d57e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.592 187164 DEBUG nova.network.os_vif_util [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:0e:16,bridge_name='br-int',has_traffic_filtering=True,id=1f839a4f-9bf4-4b39-aca7-83959475d57e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f839a4f-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.593 187164 DEBUG os_vif [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:0e:16,bridge_name='br-int',has_traffic_filtering=True,id=1f839a4f-9bf4-4b39-aca7-83959475d57e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f839a4f-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.595 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.596 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f839a4f-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.598 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.600 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.604 187164 INFO os_vif [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:0e:16,bridge_name='br-int',has_traffic_filtering=True,id=1f839a4f-9bf4-4b39-aca7-83959475d57e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f839a4f-9b')#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.605 187164 INFO nova.virt.libvirt.driver [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Deleting instance files /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351_del#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.606 187164 INFO nova.virt.libvirt.driver [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Deletion of /var/lib/nova/instances/a133dad5-02c4-4021-90e5-ee9f3322f351_del complete#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.716 187164 DEBUG nova.virt.libvirt.host [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.717 187164 INFO nova.virt.libvirt.host [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] UEFI support detected#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.719 187164 INFO nova.compute.manager [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.720 187164 DEBUG oslo.service.loopingcall [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.720 187164 DEBUG nova.compute.manager [-] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:45:04 np0005546954 nova_compute[187160]: 2025-12-05 12:45:04.720 187164 DEBUG nova.network.neutron [-] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:45:05 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:05Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5e:21:c0 10.100.0.10
Dec  5 07:45:05 np0005546954 podman[197513]: time="2025-12-05T12:45:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:45:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:45:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Dec  5 07:45:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:45:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3040 "" "Go-http-client/1.1"
Dec  5 07:45:05 np0005546954 nova_compute[187160]: 2025-12-05 12:45:05.905 187164 DEBUG nova.compute.manager [req-4e535008-a658-4e7d-a304-33c13d6d4adb req-d28afa49-6202-4a2e-991c-6afa0298be7e 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Received event network-vif-unplugged-1f839a4f-9bf4-4b39-aca7-83959475d57e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:05 np0005546954 nova_compute[187160]: 2025-12-05 12:45:05.906 187164 DEBUG oslo_concurrency.lockutils [req-4e535008-a658-4e7d-a304-33c13d6d4adb req-d28afa49-6202-4a2e-991c-6afa0298be7e 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:05 np0005546954 nova_compute[187160]: 2025-12-05 12:45:05.906 187164 DEBUG oslo_concurrency.lockutils [req-4e535008-a658-4e7d-a304-33c13d6d4adb req-d28afa49-6202-4a2e-991c-6afa0298be7e 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:05 np0005546954 nova_compute[187160]: 2025-12-05 12:45:05.906 187164 DEBUG oslo_concurrency.lockutils [req-4e535008-a658-4e7d-a304-33c13d6d4adb req-d28afa49-6202-4a2e-991c-6afa0298be7e 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:05 np0005546954 nova_compute[187160]: 2025-12-05 12:45:05.906 187164 DEBUG nova.compute.manager [req-4e535008-a658-4e7d-a304-33c13d6d4adb req-d28afa49-6202-4a2e-991c-6afa0298be7e 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] No waiting events found dispatching network-vif-unplugged-1f839a4f-9bf4-4b39-aca7-83959475d57e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:45:05 np0005546954 nova_compute[187160]: 2025-12-05 12:45:05.906 187164 DEBUG nova.compute.manager [req-4e535008-a658-4e7d-a304-33c13d6d4adb req-d28afa49-6202-4a2e-991c-6afa0298be7e 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Received event network-vif-unplugged-1f839a4f-9bf4-4b39-aca7-83959475d57e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.389 187164 DEBUG nova.network.neutron [-] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.478 187164 DEBUG nova.compute.manager [req-3258f5d4-eb52-4725-bb8a-278cf6045eb2 req-5d62e7a4-ba5e-4312-a980-f71c4979e052 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Received event network-vif-deleted-1f839a4f-9bf4-4b39-aca7-83959475d57e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.478 187164 INFO nova.compute.manager [req-3258f5d4-eb52-4725-bb8a-278cf6045eb2 req-5d62e7a4-ba5e-4312-a980-f71c4979e052 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Neutron deleted interface 1f839a4f-9bf4-4b39-aca7-83959475d57e; detaching it from the instance and deleting it from the info cache#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.479 187164 DEBUG nova.network.neutron [req-3258f5d4-eb52-4725-bb8a-278cf6045eb2 req-5d62e7a4-ba5e-4312-a980-f71c4979e052 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.540 187164 INFO nova.compute.manager [-] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Took 1.82 seconds to deallocate network for instance.#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.548 187164 DEBUG nova.compute.manager [req-3258f5d4-eb52-4725-bb8a-278cf6045eb2 req-5d62e7a4-ba5e-4312-a980-f71c4979e052 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Detach interface failed, port_id=1f839a4f-9bf4-4b39-aca7-83959475d57e, reason: Instance a133dad5-02c4-4021-90e5-ee9f3322f351 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.679 187164 DEBUG oslo_concurrency.lockutils [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.679 187164 DEBUG oslo_concurrency.lockutils [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.828 187164 DEBUG nova.compute.provider_tree [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.839 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.893 187164 DEBUG nova.scheduler.client.report [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.903 187164 WARNING nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] While synchronizing instance power states, found 5 instances in the database and 4 instances on the hypervisor.#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.904 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Triggering sync for uuid 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.904 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Triggering sync for uuid dde775e5-862e-4f88-b0e9-7d98a681bb3e _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.904 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Triggering sync for uuid 0513a02c-7fe2-43aa-9bd6-020014460672 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.904 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Triggering sync for uuid bd45ae9f-9649-4347-a5e4-658d02804ef9 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.905 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Triggering sync for uuid a133dad5-02c4-4021-90e5-ee9f3322f351 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.906 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.906 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.906 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.907 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.907 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "0513a02c-7fe2-43aa-9bd6-020014460672" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.907 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "0513a02c-7fe2-43aa-9bd6-020014460672" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.908 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "bd45ae9f-9649-4347-a5e4-658d02804ef9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.908 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.908 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "a133dad5-02c4-4021-90e5-ee9f3322f351" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.917 187164 DEBUG oslo_concurrency.lockutils [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:06 np0005546954 nova_compute[187160]: 2025-12-05 12:45:06.945 187164 INFO nova.scheduler.client.report [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Deleted allocations for instance a133dad5-02c4-4021-90e5-ee9f3322f351#033[00m
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.001 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.002 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.008 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.036 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "0513a02c-7fe2-43aa-9bd6-020014460672" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.440 187164 DEBUG oslo_concurrency.lockutils [None req-65daf801-68fe-43d7-bad9-83543ccb37eb 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.443 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.536 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:07 np0005546954 podman[209702]: 2025-12-05 12:45:07.648422686 +0000 UTC m=+0.134196897 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.989 187164 DEBUG nova.compute.manager [req-50c70f59-9bb0-42e0-ac77-b025a6b46d52 req-c9be7d0a-ae0b-4871-84eb-3d31009d0225 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Received event network-vif-plugged-1f839a4f-9bf4-4b39-aca7-83959475d57e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.989 187164 DEBUG oslo_concurrency.lockutils [req-50c70f59-9bb0-42e0-ac77-b025a6b46d52 req-c9be7d0a-ae0b-4871-84eb-3d31009d0225 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.990 187164 DEBUG oslo_concurrency.lockutils [req-50c70f59-9bb0-42e0-ac77-b025a6b46d52 req-c9be7d0a-ae0b-4871-84eb-3d31009d0225 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.990 187164 DEBUG oslo_concurrency.lockutils [req-50c70f59-9bb0-42e0-ac77-b025a6b46d52 req-c9be7d0a-ae0b-4871-84eb-3d31009d0225 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a133dad5-02c4-4021-90e5-ee9f3322f351-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.990 187164 DEBUG nova.compute.manager [req-50c70f59-9bb0-42e0-ac77-b025a6b46d52 req-c9be7d0a-ae0b-4871-84eb-3d31009d0225 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] No waiting events found dispatching network-vif-plugged-1f839a4f-9bf4-4b39-aca7-83959475d57e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:45:07 np0005546954 nova_compute[187160]: 2025-12-05 12:45:07.991 187164 WARNING nova.compute.manager [req-50c70f59-9bb0-42e0-ac77-b025a6b46d52 req-c9be7d0a-ae0b-4871-84eb-3d31009d0225 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Received unexpected event network-vif-plugged-1f839a4f-9bf4-4b39-aca7-83959475d57e for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:45:08 np0005546954 nova_compute[187160]: 2025-12-05 12:45:08.771 187164 DEBUG oslo_concurrency.lockutils [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "bd45ae9f-9649-4347-a5e4-658d02804ef9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:08 np0005546954 nova_compute[187160]: 2025-12-05 12:45:08.771 187164 DEBUG oslo_concurrency.lockutils [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:08 np0005546954 nova_compute[187160]: 2025-12-05 12:45:08.772 187164 DEBUG oslo_concurrency.lockutils [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:08 np0005546954 nova_compute[187160]: 2025-12-05 12:45:08.772 187164 DEBUG oslo_concurrency.lockutils [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:08 np0005546954 nova_compute[187160]: 2025-12-05 12:45:08.773 187164 DEBUG oslo_concurrency.lockutils [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:08 np0005546954 nova_compute[187160]: 2025-12-05 12:45:08.775 187164 INFO nova.compute.manager [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Terminating instance#033[00m
Dec  5 07:45:08 np0005546954 nova_compute[187160]: 2025-12-05 12:45:08.776 187164 DEBUG nova.compute.manager [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:45:08 np0005546954 kernel: tap655d2088-f1 (unregistering): left promiscuous mode
Dec  5 07:45:08 np0005546954 NetworkManager[55665]: <info>  [1764938708.8098] device (tap655d2088-f1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:45:08 np0005546954 nova_compute[187160]: 2025-12-05 12:45:08.812 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:08 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:08Z|00054|binding|INFO|Releasing lport 655d2088-f140-4abd-a02f-295b2208fecf from this chassis (sb_readonly=0)
Dec  5 07:45:08 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:08Z|00055|binding|INFO|Setting lport 655d2088-f140-4abd-a02f-295b2208fecf down in Southbound
Dec  5 07:45:08 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:08Z|00056|binding|INFO|Removing iface tap655d2088-f1 ovn-installed in OVS
Dec  5 07:45:08 np0005546954 nova_compute[187160]: 2025-12-05 12:45:08.816 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.821 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:21:c0 10.100.0.10'], port_security=['fa:16:3e:5e:21:c0 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bd45ae9f-9649-4347-a5e4-658d02804ef9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee43e901-b158-4dc0-894f-2384aef8b277', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a865bea6-413e-4ecb-bace-2ec9005935f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c84cb49-52df-48a9-8d24-aff5b642e12a, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=655d2088-f140-4abd-a02f-295b2208fecf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.822 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 655d2088-f140-4abd-a02f-295b2208fecf in datapath ee43e901-b158-4dc0-894f-2384aef8b277 unbound from our chassis#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.824 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee43e901-b158-4dc0-894f-2384aef8b277#033[00m
Dec  5 07:45:08 np0005546954 nova_compute[187160]: 2025-12-05 12:45:08.845 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.852 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[9a11f694-d6d9-493a-8e5e-eefb119fbb34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:08 np0005546954 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Dec  5 07:45:08 np0005546954 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 11.865s CPU time.
Dec  5 07:45:08 np0005546954 systemd-machined[153497]: Machine qemu-5-instance-00000005 terminated.
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.888 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[73a536fb-f3e4-4710-95cc-6c78d1bf00f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.892 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[2e541bbc-84ef-4e55-b90d-56bc4c5c6c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.927 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[0202188e-4273-4abc-b21a-cdaf8968bc9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.953 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b268601d-1853-4ee6-890e-d826e1483f89]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee43e901-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:af:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 15, 'rx_bytes': 1624, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 15, 'rx_bytes': 1624, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358945, 'reachable_time': 17578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209735, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.978 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ab1bdd40-3df6-460f-8f7b-53d30a5cb2e5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358963, 'tstamp': 358963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209736, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358968, 'tstamp': 358968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209736, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.980 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee43e901-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:08 np0005546954 nova_compute[187160]: 2025-12-05 12:45:08.982 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:08 np0005546954 nova_compute[187160]: 2025-12-05 12:45:08.988 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.988 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee43e901-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.989 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.989 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee43e901-b0, col_values=(('external_ids', {'iface-id': 'ff42a43f-b4ac-4be3-b747-f3c0a6e67328'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:08 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:08.989 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.064 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.067 187164 INFO nova.virt.libvirt.driver [-] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Instance destroyed successfully.#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.067 187164 DEBUG nova.objects.instance [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lazy-loading 'resources' on Instance uuid bd45ae9f-9649-4347-a5e4-658d02804ef9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.082 187164 DEBUG nova.virt.libvirt.vif [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:44:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-238469778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-238469778',id=5,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:44:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-8yye3j0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:44:59Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=bd45ae9f-9649-4347-a5e4-658d02804ef9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "655d2088-f140-4abd-a02f-295b2208fecf", "address": "fa:16:3e:5e:21:c0", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap655d2088-f1", "ovs_interfaceid": "655d2088-f140-4abd-a02f-295b2208fecf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.083 187164 DEBUG nova.network.os_vif_util [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converting VIF {"id": "655d2088-f140-4abd-a02f-295b2208fecf", "address": "fa:16:3e:5e:21:c0", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap655d2088-f1", "ovs_interfaceid": "655d2088-f140-4abd-a02f-295b2208fecf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.084 187164 DEBUG nova.network.os_vif_util [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5e:21:c0,bridge_name='br-int',has_traffic_filtering=True,id=655d2088-f140-4abd-a02f-295b2208fecf,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap655d2088-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.084 187164 DEBUG os_vif [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:21:c0,bridge_name='br-int',has_traffic_filtering=True,id=655d2088-f140-4abd-a02f-295b2208fecf,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap655d2088-f1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.086 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.087 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap655d2088-f1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.089 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.092 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.095 187164 INFO os_vif [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:21:c0,bridge_name='br-int',has_traffic_filtering=True,id=655d2088-f140-4abd-a02f-295b2208fecf,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap655d2088-f1')#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.096 187164 INFO nova.virt.libvirt.driver [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Deleting instance files /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9_del#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.104 187164 INFO nova.virt.libvirt.driver [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Deletion of /var/lib/nova/instances/bd45ae9f-9649-4347-a5e4-658d02804ef9_del complete#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.172 187164 INFO nova.compute.manager [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.173 187164 DEBUG oslo.service.loopingcall [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.173 187164 DEBUG nova.compute.manager [-] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.173 187164 DEBUG nova.network.neutron [-] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.692 187164 DEBUG nova.network.neutron [-] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.713 187164 INFO nova.compute.manager [-] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Took 0.54 seconds to deallocate network for instance.#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.750 187164 DEBUG nova.compute.manager [req-9bb63ec7-5256-447f-8922-d8c385dbf736 req-cbe2a935-6067-49dc-b29c-0b3e0988d976 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received event network-vif-deleted-655d2088-f140-4abd-a02f-295b2208fecf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.772 187164 DEBUG oslo_concurrency.lockutils [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.773 187164 DEBUG oslo_concurrency.lockutils [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.872 187164 DEBUG nova.compute.provider_tree [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.894 187164 DEBUG nova.scheduler.client.report [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.912 187164 DEBUG oslo_concurrency.lockutils [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:09 np0005546954 nova_compute[187160]: 2025-12-05 12:45:09.939 187164 INFO nova.scheduler.client.report [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Deleted allocations for instance bd45ae9f-9649-4347-a5e4-658d02804ef9#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.008 187164 DEBUG oslo_concurrency.lockutils [None req-5d729f1e-6dbf-4c4b-a657-2e0f917ffaf8 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.061 187164 DEBUG nova.compute.manager [req-cdd26766-95d7-4759-bfff-69dcf96c1acc req-a6905ef3-d4bb-4e94-a84e-0a471e856865 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received event network-vif-unplugged-655d2088-f140-4abd-a02f-295b2208fecf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.062 187164 DEBUG oslo_concurrency.lockutils [req-cdd26766-95d7-4759-bfff-69dcf96c1acc req-a6905ef3-d4bb-4e94-a84e-0a471e856865 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.062 187164 DEBUG oslo_concurrency.lockutils [req-cdd26766-95d7-4759-bfff-69dcf96c1acc req-a6905ef3-d4bb-4e94-a84e-0a471e856865 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.062 187164 DEBUG oslo_concurrency.lockutils [req-cdd26766-95d7-4759-bfff-69dcf96c1acc req-a6905ef3-d4bb-4e94-a84e-0a471e856865 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.062 187164 DEBUG nova.compute.manager [req-cdd26766-95d7-4759-bfff-69dcf96c1acc req-a6905ef3-d4bb-4e94-a84e-0a471e856865 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] No waiting events found dispatching network-vif-unplugged-655d2088-f140-4abd-a02f-295b2208fecf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.063 187164 WARNING nova.compute.manager [req-cdd26766-95d7-4759-bfff-69dcf96c1acc req-a6905ef3-d4bb-4e94-a84e-0a471e856865 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received unexpected event network-vif-unplugged-655d2088-f140-4abd-a02f-295b2208fecf for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.063 187164 DEBUG nova.compute.manager [req-cdd26766-95d7-4759-bfff-69dcf96c1acc req-a6905ef3-d4bb-4e94-a84e-0a471e856865 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received event network-vif-plugged-655d2088-f140-4abd-a02f-295b2208fecf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.063 187164 DEBUG oslo_concurrency.lockutils [req-cdd26766-95d7-4759-bfff-69dcf96c1acc req-a6905ef3-d4bb-4e94-a84e-0a471e856865 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.063 187164 DEBUG oslo_concurrency.lockutils [req-cdd26766-95d7-4759-bfff-69dcf96c1acc req-a6905ef3-d4bb-4e94-a84e-0a471e856865 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.063 187164 DEBUG oslo_concurrency.lockutils [req-cdd26766-95d7-4759-bfff-69dcf96c1acc req-a6905ef3-d4bb-4e94-a84e-0a471e856865 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bd45ae9f-9649-4347-a5e4-658d02804ef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.064 187164 DEBUG nova.compute.manager [req-cdd26766-95d7-4759-bfff-69dcf96c1acc req-a6905ef3-d4bb-4e94-a84e-0a471e856865 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] No waiting events found dispatching network-vif-plugged-655d2088-f140-4abd-a02f-295b2208fecf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.064 187164 WARNING nova.compute.manager [req-cdd26766-95d7-4759-bfff-69dcf96c1acc req-a6905ef3-d4bb-4e94-a84e-0a471e856865 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Received unexpected event network-vif-plugged-655d2088-f140-4abd-a02f-295b2208fecf for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.523 187164 DEBUG oslo_concurrency.lockutils [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "0513a02c-7fe2-43aa-9bd6-020014460672" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.523 187164 DEBUG oslo_concurrency.lockutils [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.524 187164 DEBUG oslo_concurrency.lockutils [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.524 187164 DEBUG oslo_concurrency.lockutils [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.524 187164 DEBUG oslo_concurrency.lockutils [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.528 187164 INFO nova.compute.manager [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Terminating instance#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.530 187164 DEBUG nova.compute.manager [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:45:10 np0005546954 kernel: tap0f6a798c-c1 (unregistering): left promiscuous mode
Dec  5 07:45:10 np0005546954 NetworkManager[55665]: <info>  [1764938710.5655] device (tap0f6a798c-c1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:45:10 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:10Z|00057|binding|INFO|Releasing lport 0f6a798c-c13a-409c-8274-1b8ad42ad19b from this chassis (sb_readonly=0)
Dec  5 07:45:10 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:10Z|00058|binding|INFO|Setting lport 0f6a798c-c13a-409c-8274-1b8ad42ad19b down in Southbound
Dec  5 07:45:10 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:10Z|00059|binding|INFO|Removing iface tap0f6a798c-c1 ovn-installed in OVS
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.570 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.574 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.580 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:8f:94 10.100.0.4'], port_security=['fa:16:3e:1f:8f:94 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0513a02c-7fe2-43aa-9bd6-020014460672', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee43e901-b158-4dc0-894f-2384aef8b277', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a865bea6-413e-4ecb-bace-2ec9005935f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c84cb49-52df-48a9-8d24-aff5b642e12a, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=0f6a798c-c13a-409c-8274-1b8ad42ad19b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.582 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 0f6a798c-c13a-409c-8274-1b8ad42ad19b in datapath ee43e901-b158-4dc0-894f-2384aef8b277 unbound from our chassis#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.586 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee43e901-b158-4dc0-894f-2384aef8b277#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.588 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.611 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[56d3f0d8-a805-472e-8e70-ea65ca88f397]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:10 np0005546954 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Dec  5 07:45:10 np0005546954 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 15.463s CPU time.
Dec  5 07:45:10 np0005546954 systemd-machined[153497]: Machine qemu-2-instance-00000004 terminated.
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.652 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce2251a-8ee3-42fd-be28-e5d7a458cd2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.656 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[6b78adb4-a2b6-4cfa-9e23-86e1226129d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.693 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[353f11cb-c85f-49e3-954e-e32126ea6999]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.711 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[3558274d-8d24-4314-b14d-22633104b772]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee43e901-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:af:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 17, 'rx_bytes': 1624, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 17, 'rx_bytes': 1624, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358945, 'reachable_time': 17578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209765, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.730 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1a9720-3166-450d-91a8-a8c594606f3c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358963, 'tstamp': 358963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209766, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358968, 'tstamp': 358968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209766, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.731 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee43e901-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.733 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.737 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.738 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee43e901-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.738 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.738 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee43e901-b0, col_values=(('external_ids', {'iface-id': 'ff42a43f-b4ac-4be3-b747-f3c0a6e67328'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:10.739 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.799 187164 INFO nova.virt.libvirt.driver [-] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Instance destroyed successfully.#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.799 187164 DEBUG nova.objects.instance [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lazy-loading 'resources' on Instance uuid 0513a02c-7fe2-43aa-9bd6-020014460672 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.816 187164 DEBUG nova.virt.libvirt.vif [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:43:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1665104635',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1665104635',id=4,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:43:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-3hvxvigf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:43:59Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=0513a02c-7fe2-43aa-9bd6-020014460672,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "address": "fa:16:3e:1f:8f:94", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6a798c-c1", "ovs_interfaceid": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.816 187164 DEBUG nova.network.os_vif_util [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converting VIF {"id": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "address": "fa:16:3e:1f:8f:94", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6a798c-c1", "ovs_interfaceid": "0f6a798c-c13a-409c-8274-1b8ad42ad19b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.817 187164 DEBUG nova.network.os_vif_util [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:8f:94,bridge_name='br-int',has_traffic_filtering=True,id=0f6a798c-c13a-409c-8274-1b8ad42ad19b,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6a798c-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.817 187164 DEBUG os_vif [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:8f:94,bridge_name='br-int',has_traffic_filtering=True,id=0f6a798c-c13a-409c-8274-1b8ad42ad19b,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6a798c-c1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.818 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.818 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f6a798c-c1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.820 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.820 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.822 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.823 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.825 187164 INFO os_vif [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:8f:94,bridge_name='br-int',has_traffic_filtering=True,id=0f6a798c-c13a-409c-8274-1b8ad42ad19b,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6a798c-c1')#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.826 187164 INFO nova.virt.libvirt.driver [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Deleting instance files /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672_del#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.826 187164 INFO nova.virt.libvirt.driver [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Deletion of /var/lib/nova/instances/0513a02c-7fe2-43aa-9bd6-020014460672_del complete#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.871 187164 INFO nova.compute.manager [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.871 187164 DEBUG oslo.service.loopingcall [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.872 187164 DEBUG nova.compute.manager [-] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:45:10 np0005546954 nova_compute[187160]: 2025-12-05 12:45:10.872 187164 DEBUG nova.network.neutron [-] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:45:11 np0005546954 nova_compute[187160]: 2025-12-05 12:45:11.399 187164 DEBUG nova.network.neutron [-] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:45:11 np0005546954 nova_compute[187160]: 2025-12-05 12:45:11.417 187164 INFO nova.compute.manager [-] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Took 0.55 seconds to deallocate network for instance.#033[00m
Dec  5 07:45:11 np0005546954 nova_compute[187160]: 2025-12-05 12:45:11.468 187164 DEBUG oslo_concurrency.lockutils [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:11 np0005546954 nova_compute[187160]: 2025-12-05 12:45:11.469 187164 DEBUG oslo_concurrency.lockutils [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:11 np0005546954 nova_compute[187160]: 2025-12-05 12:45:11.549 187164 DEBUG nova.compute.provider_tree [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:45:11 np0005546954 nova_compute[187160]: 2025-12-05 12:45:11.563 187164 DEBUG nova.scheduler.client.report [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:45:11 np0005546954 nova_compute[187160]: 2025-12-05 12:45:11.588 187164 DEBUG oslo_concurrency.lockutils [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:11 np0005546954 nova_compute[187160]: 2025-12-05 12:45:11.611 187164 INFO nova.scheduler.client.report [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Deleted allocations for instance 0513a02c-7fe2-43aa-9bd6-020014460672#033[00m
Dec  5 07:45:11 np0005546954 nova_compute[187160]: 2025-12-05 12:45:11.674 187164 DEBUG oslo_concurrency.lockutils [None req-1f5b249c-524a-442f-9870-0dc79452a1a2 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:11 np0005546954 nova_compute[187160]: 2025-12-05 12:45:11.819 187164 DEBUG nova.compute.manager [req-62a6b3e5-58b0-47ec-8703-2f9f6826abab req-c1bd09b9-4d7b-403e-8a02-96e198de8973 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Received event network-vif-deleted-0f6a798c-c13a-409c-8274-1b8ad42ad19b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.137 187164 DEBUG nova.compute.manager [req-c758c42d-d6e5-420e-aa81-9f0e408c4f2c req-1946fbda-3ef2-47c6-bcc4-0ea1e708c653 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Received event network-vif-unplugged-0f6a798c-c13a-409c-8274-1b8ad42ad19b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.138 187164 DEBUG oslo_concurrency.lockutils [req-c758c42d-d6e5-420e-aa81-9f0e408c4f2c req-1946fbda-3ef2-47c6-bcc4-0ea1e708c653 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.138 187164 DEBUG oslo_concurrency.lockutils [req-c758c42d-d6e5-420e-aa81-9f0e408c4f2c req-1946fbda-3ef2-47c6-bcc4-0ea1e708c653 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.138 187164 DEBUG oslo_concurrency.lockutils [req-c758c42d-d6e5-420e-aa81-9f0e408c4f2c req-1946fbda-3ef2-47c6-bcc4-0ea1e708c653 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.138 187164 DEBUG nova.compute.manager [req-c758c42d-d6e5-420e-aa81-9f0e408c4f2c req-1946fbda-3ef2-47c6-bcc4-0ea1e708c653 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] No waiting events found dispatching network-vif-unplugged-0f6a798c-c13a-409c-8274-1b8ad42ad19b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.139 187164 WARNING nova.compute.manager [req-c758c42d-d6e5-420e-aa81-9f0e408c4f2c req-1946fbda-3ef2-47c6-bcc4-0ea1e708c653 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Received unexpected event network-vif-unplugged-0f6a798c-c13a-409c-8274-1b8ad42ad19b for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.139 187164 DEBUG nova.compute.manager [req-c758c42d-d6e5-420e-aa81-9f0e408c4f2c req-1946fbda-3ef2-47c6-bcc4-0ea1e708c653 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Received event network-vif-plugged-0f6a798c-c13a-409c-8274-1b8ad42ad19b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.139 187164 DEBUG oslo_concurrency.lockutils [req-c758c42d-d6e5-420e-aa81-9f0e408c4f2c req-1946fbda-3ef2-47c6-bcc4-0ea1e708c653 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.139 187164 DEBUG oslo_concurrency.lockutils [req-c758c42d-d6e5-420e-aa81-9f0e408c4f2c req-1946fbda-3ef2-47c6-bcc4-0ea1e708c653 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.140 187164 DEBUG oslo_concurrency.lockutils [req-c758c42d-d6e5-420e-aa81-9f0e408c4f2c req-1946fbda-3ef2-47c6-bcc4-0ea1e708c653 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "0513a02c-7fe2-43aa-9bd6-020014460672-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.140 187164 DEBUG nova.compute.manager [req-c758c42d-d6e5-420e-aa81-9f0e408c4f2c req-1946fbda-3ef2-47c6-bcc4-0ea1e708c653 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] No waiting events found dispatching network-vif-plugged-0f6a798c-c13a-409c-8274-1b8ad42ad19b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.140 187164 WARNING nova.compute.manager [req-c758c42d-d6e5-420e-aa81-9f0e408c4f2c req-1946fbda-3ef2-47c6-bcc4-0ea1e708c653 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Received unexpected event network-vif-plugged-0f6a798c-c13a-409c-8274-1b8ad42ad19b for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:45:12 np0005546954 podman[209785]: 2025-12-05 12:45:12.575080144 +0000 UTC m=+0.073319843 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.768 187164 DEBUG oslo_concurrency.lockutils [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.769 187164 DEBUG oslo_concurrency.lockutils [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.769 187164 DEBUG oslo_concurrency.lockutils [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.769 187164 DEBUG oslo_concurrency.lockutils [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.769 187164 DEBUG oslo_concurrency.lockutils [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.771 187164 INFO nova.compute.manager [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Terminating instance#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.772 187164 DEBUG nova.compute.manager [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:45:12 np0005546954 kernel: tap79860104-80 (unregistering): left promiscuous mode
Dec  5 07:45:12 np0005546954 NetworkManager[55665]: <info>  [1764938712.8033] device (tap79860104-80): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:45:12 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:12Z|00060|binding|INFO|Releasing lport 79860104-80d8-4998-9c9d-057e3c980d6e from this chassis (sb_readonly=0)
Dec  5 07:45:12 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:12Z|00061|binding|INFO|Setting lport 79860104-80d8-4998-9c9d-057e3c980d6e down in Southbound
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.807 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:12 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:12Z|00062|binding|INFO|Removing iface tap79860104-80 ovn-installed in OVS
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.811 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.818 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:c4:c3 10.100.0.5'], port_security=['fa:16:3e:e8:c4:c3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dde775e5-862e-4f88-b0e9-7d98a681bb3e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee43e901-b158-4dc0-894f-2384aef8b277', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'a865bea6-413e-4ecb-bace-2ec9005935f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c84cb49-52df-48a9-8d24-aff5b642e12a, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=79860104-80d8-4998-9c9d-057e3c980d6e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.819 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 79860104-80d8-4998-9c9d-057e3c980d6e in datapath ee43e901-b158-4dc0-894f-2384aef8b277 unbound from our chassis#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.820 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee43e901-b158-4dc0-894f-2384aef8b277#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.826 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.834 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ac799f3f-898e-4114-95e8-3a5c81a436ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:12 np0005546954 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Deactivated successfully.
Dec  5 07:45:12 np0005546954 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Consumed 2.397s CPU time.
Dec  5 07:45:12 np0005546954 systemd-machined[153497]: Machine qemu-4-instance-00000003 terminated.
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.862 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[10006cae-0359-4713-8321-43a75fa16828]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.865 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[4acc14b1-33b5-44de-b144-c4d222fa58bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.898 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[c944f97f-6aaf-42c7-8f58-05d6ed523c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.914 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[8123af98-4f3a-4473-b04d-d59f75e6a67a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee43e901-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:af:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 19, 'rx_bytes': 1624, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 19, 'rx_bytes': 1624, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358945, 'reachable_time': 17578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209822, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.930 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4551cc31-6730-4bfd-af41-b05b84b94fb8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358963, 'tstamp': 358963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209823, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapee43e901-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358968, 'tstamp': 358968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209823, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.933 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee43e901-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.935 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:12 np0005546954 nova_compute[187160]: 2025-12-05 12:45:12.941 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.942 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee43e901-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.942 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.943 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee43e901-b0, col_values=(('external_ids', {'iface-id': 'ff42a43f-b4ac-4be3-b747-f3c0a6e67328'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:12.943 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.044 187164 INFO nova.virt.libvirt.driver [-] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Instance destroyed successfully.#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.045 187164 DEBUG nova.objects.instance [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lazy-loading 'resources' on Instance uuid dde775e5-862e-4f88-b0e9-7d98a681bb3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.062 187164 DEBUG nova.virt.libvirt.vif [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:43:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-364941227',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-364941227',id=3,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:43:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-ewuwj2i6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',i
mage_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:44:51Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=dde775e5-862e-4f88-b0e9-7d98a681bb3e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "79860104-80d8-4998-9c9d-057e3c980d6e", "address": "fa:16:3e:e8:c4:c3", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79860104-80", "ovs_interfaceid": "79860104-80d8-4998-9c9d-057e3c980d6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.062 187164 DEBUG nova.network.os_vif_util [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converting VIF {"id": "79860104-80d8-4998-9c9d-057e3c980d6e", "address": "fa:16:3e:e8:c4:c3", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79860104-80", "ovs_interfaceid": "79860104-80d8-4998-9c9d-057e3c980d6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.063 187164 DEBUG nova.network.os_vif_util [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e8:c4:c3,bridge_name='br-int',has_traffic_filtering=True,id=79860104-80d8-4998-9c9d-057e3c980d6e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79860104-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.063 187164 DEBUG os_vif [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:c4:c3,bridge_name='br-int',has_traffic_filtering=True,id=79860104-80d8-4998-9c9d-057e3c980d6e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79860104-80') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.065 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.066 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79860104-80, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.069 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.072 187164 INFO os_vif [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:c4:c3,bridge_name='br-int',has_traffic_filtering=True,id=79860104-80d8-4998-9c9d-057e3c980d6e,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79860104-80')#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.073 187164 INFO nova.virt.libvirt.driver [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Deleting instance files /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e_del#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.073 187164 INFO nova.virt.libvirt.driver [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Deletion of /var/lib/nova/instances/dde775e5-862e-4f88-b0e9-7d98a681bb3e_del complete#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.133 187164 INFO nova.compute.manager [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.135 187164 DEBUG oslo.service.loopingcall [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.135 187164 DEBUG nova.compute.manager [-] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.135 187164 DEBUG nova.network.neutron [-] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:45:13 np0005546954 podman[209840]: 2025-12-05 12:45:13.159180321 +0000 UTC m=+0.104836247 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.837 187164 DEBUG nova.network.neutron [-] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.858 187164 INFO nova.compute.manager [-] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Took 0.72 seconds to deallocate network for instance.#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.893 187164 DEBUG nova.compute.manager [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Received event network-vif-unplugged-79860104-80d8-4998-9c9d-057e3c980d6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.894 187164 DEBUG oslo_concurrency.lockutils [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.894 187164 DEBUG oslo_concurrency.lockutils [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.894 187164 DEBUG oslo_concurrency.lockutils [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.895 187164 DEBUG nova.compute.manager [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] No waiting events found dispatching network-vif-unplugged-79860104-80d8-4998-9c9d-057e3c980d6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.895 187164 DEBUG nova.compute.manager [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Received event network-vif-unplugged-79860104-80d8-4998-9c9d-057e3c980d6e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.895 187164 DEBUG nova.compute.manager [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Received event network-vif-plugged-79860104-80d8-4998-9c9d-057e3c980d6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.895 187164 DEBUG oslo_concurrency.lockutils [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.896 187164 DEBUG oslo_concurrency.lockutils [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.896 187164 DEBUG oslo_concurrency.lockutils [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.896 187164 DEBUG nova.compute.manager [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] No waiting events found dispatching network-vif-plugged-79860104-80d8-4998-9c9d-057e3c980d6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.897 187164 WARNING nova.compute.manager [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Received unexpected event network-vif-plugged-79860104-80d8-4998-9c9d-057e3c980d6e for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.897 187164 DEBUG nova.compute.manager [req-d065eeff-e84c-425c-b2ef-6214c293b6a5 req-c1e176f0-788a-40d2-bbe6-419524c7e524 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Received event network-vif-deleted-79860104-80d8-4998-9c9d-057e3c980d6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.929 187164 DEBUG oslo_concurrency.lockutils [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.930 187164 DEBUG oslo_concurrency.lockutils [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:13 np0005546954 nova_compute[187160]: 2025-12-05 12:45:13.994 187164 DEBUG nova.compute.provider_tree [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:45:14 np0005546954 nova_compute[187160]: 2025-12-05 12:45:14.019 187164 DEBUG nova.scheduler.client.report [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:45:14 np0005546954 nova_compute[187160]: 2025-12-05 12:45:14.042 187164 DEBUG oslo_concurrency.lockutils [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:14 np0005546954 nova_compute[187160]: 2025-12-05 12:45:14.066 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:14 np0005546954 nova_compute[187160]: 2025-12-05 12:45:14.078 187164 INFO nova.scheduler.client.report [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Deleted allocations for instance dde775e5-862e-4f88-b0e9-7d98a681bb3e#033[00m
Dec  5 07:45:14 np0005546954 nova_compute[187160]: 2025-12-05 12:45:14.209 187164 DEBUG oslo_concurrency.lockutils [None req-3aee6fe3-d14e-4885-929a-1f4dbb630490 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "dde775e5-862e-4f88-b0e9-7d98a681bb3e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.187 187164 DEBUG oslo_concurrency.lockutils [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.188 187164 DEBUG oslo_concurrency.lockutils [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.188 187164 DEBUG oslo_concurrency.lockutils [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.188 187164 DEBUG oslo_concurrency.lockutils [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.188 187164 DEBUG oslo_concurrency.lockutils [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.190 187164 INFO nova.compute.manager [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Terminating instance#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.191 187164 DEBUG nova.compute.manager [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:45:15 np0005546954 kernel: tapf9c0965a-86 (unregistering): left promiscuous mode
Dec  5 07:45:15 np0005546954 NetworkManager[55665]: <info>  [1764938715.2160] device (tapf9c0965a-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:45:15 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:15Z|00063|binding|INFO|Releasing lport f9c0965a-861e-4c24-9c97-679c6d706267 from this chassis (sb_readonly=0)
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.219 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:15 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:15Z|00064|binding|INFO|Setting lport f9c0965a-861e-4c24-9c97-679c6d706267 down in Southbound
Dec  5 07:45:15 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:15Z|00065|binding|INFO|Removing iface tapf9c0965a-86 ovn-installed in OVS
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.223 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.230 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:40:2c 10.100.0.12'], port_security=['fa:16:3e:99:40:2c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2bf13a3e-bb2a-45f0-893e-0eb33fedb85e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee43e901-b158-4dc0-894f-2384aef8b277', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b5f383ed0484ca1bde081bf623dad4b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a865bea6-413e-4ecb-bace-2ec9005935f1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c84cb49-52df-48a9-8d24-aff5b642e12a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=f9c0965a-861e-4c24-9c97-679c6d706267) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.231 104428 INFO neutron.agent.ovn.metadata.agent [-] Port f9c0965a-861e-4c24-9c97-679c6d706267 in datapath ee43e901-b158-4dc0-894f-2384aef8b277 unbound from our chassis#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.232 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee43e901-b158-4dc0-894f-2384aef8b277, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.233 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e760ebfa-cfff-4e80-815c-c55b1ce81f0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.234 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277 namespace which is not needed anymore#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.239 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:15 np0005546954 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec  5 07:45:15 np0005546954 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 18.302s CPU time.
Dec  5 07:45:15 np0005546954 systemd-machined[153497]: Machine qemu-1-instance-00000002 terminated.
Dec  5 07:45:15 np0005546954 neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277[208811]: [NOTICE]   (208817) : haproxy version is 2.8.14-c23fe91
Dec  5 07:45:15 np0005546954 neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277[208811]: [NOTICE]   (208817) : path to executable is /usr/sbin/haproxy
Dec  5 07:45:15 np0005546954 neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277[208811]: [WARNING]  (208817) : Exiting Master process...
Dec  5 07:45:15 np0005546954 neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277[208811]: [ALERT]    (208817) : Current worker (208819) exited with code 143 (Terminated)
Dec  5 07:45:15 np0005546954 neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277[208811]: [WARNING]  (208817) : All workers exited. Exiting... (0)
Dec  5 07:45:15 np0005546954 systemd[1]: libpod-b918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc.scope: Deactivated successfully.
Dec  5 07:45:15 np0005546954 podman[209891]: 2025-12-05 12:45:15.387293204 +0000 UTC m=+0.053205634 container died b918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:45:15 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc-userdata-shm.mount: Deactivated successfully.
Dec  5 07:45:15 np0005546954 systemd[1]: var-lib-containers-storage-overlay-3dddda690aabd64c6917a822df0daded70952bb8bf0dac5f394abe51f6444385-merged.mount: Deactivated successfully.
Dec  5 07:45:15 np0005546954 NetworkManager[55665]: <info>  [1764938715.4250] manager: (tapf9c0965a-86): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.427 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:15 np0005546954 podman[209891]: 2025-12-05 12:45:15.431478275 +0000 UTC m=+0.097390705 container cleanup b918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.433 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:15 np0005546954 systemd[1]: libpod-conmon-b918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc.scope: Deactivated successfully.
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.464 187164 DEBUG nova.compute.manager [req-347ee3b4-3a30-48b5-9107-f1969d047a6a req-ea8763b4-2f0b-4256-aeeb-24ae953c556b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Received event network-vif-unplugged-f9c0965a-861e-4c24-9c97-679c6d706267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.465 187164 DEBUG oslo_concurrency.lockutils [req-347ee3b4-3a30-48b5-9107-f1969d047a6a req-ea8763b4-2f0b-4256-aeeb-24ae953c556b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.465 187164 DEBUG oslo_concurrency.lockutils [req-347ee3b4-3a30-48b5-9107-f1969d047a6a req-ea8763b4-2f0b-4256-aeeb-24ae953c556b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.466 187164 DEBUG oslo_concurrency.lockutils [req-347ee3b4-3a30-48b5-9107-f1969d047a6a req-ea8763b4-2f0b-4256-aeeb-24ae953c556b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.466 187164 DEBUG nova.compute.manager [req-347ee3b4-3a30-48b5-9107-f1969d047a6a req-ea8763b4-2f0b-4256-aeeb-24ae953c556b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] No waiting events found dispatching network-vif-unplugged-f9c0965a-861e-4c24-9c97-679c6d706267 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.466 187164 DEBUG nova.compute.manager [req-347ee3b4-3a30-48b5-9107-f1969d047a6a req-ea8763b4-2f0b-4256-aeeb-24ae953c556b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Received event network-vif-unplugged-f9c0965a-861e-4c24-9c97-679c6d706267 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.470 187164 INFO nova.virt.libvirt.driver [-] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Instance destroyed successfully.#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.470 187164 DEBUG nova.objects.instance [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lazy-loading 'resources' on Instance uuid 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.488 187164 DEBUG nova.virt.libvirt.vif [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:42:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-329469785',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-329469785',id=2,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:43:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6b5f383ed0484ca1bde081bf623dad4b',ramdisk_id='',reservation_id='r-trugiq9o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1570363089',owner_user_name='tempest-TestExecuteActionsViaActuator-1570363089-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:43:00Z,user_data=None,user_id='7ce7ef64754e4a32b4af3272e31a4a5e',uuid=2bf13a3e-bb2a-45f0-893e-0eb33fedb85e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f9c0965a-861e-4c24-9c97-679c6d706267", "address": "fa:16:3e:99:40:2c", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9c0965a-86", "ovs_interfaceid": "f9c0965a-861e-4c24-9c97-679c6d706267", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.489 187164 DEBUG nova.network.os_vif_util [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converting VIF {"id": "f9c0965a-861e-4c24-9c97-679c6d706267", "address": "fa:16:3e:99:40:2c", "network": {"id": "ee43e901-b158-4dc0-894f-2384aef8b277", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1864269289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b5f383ed0484ca1bde081bf623dad4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9c0965a-86", "ovs_interfaceid": "f9c0965a-861e-4c24-9c97-679c6d706267", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.490 187164 DEBUG nova.network.os_vif_util [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:99:40:2c,bridge_name='br-int',has_traffic_filtering=True,id=f9c0965a-861e-4c24-9c97-679c6d706267,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9c0965a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.490 187164 DEBUG os_vif [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:40:2c,bridge_name='br-int',has_traffic_filtering=True,id=f9c0965a-861e-4c24-9c97-679c6d706267,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9c0965a-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.492 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.492 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9c0965a-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.493 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.495 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.499 187164 INFO os_vif [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:40:2c,bridge_name='br-int',has_traffic_filtering=True,id=f9c0965a-861e-4c24-9c97-679c6d706267,network=Network(ee43e901-b158-4dc0-894f-2384aef8b277),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9c0965a-86')#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.499 187164 INFO nova.virt.libvirt.driver [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Deleting instance files /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e_del#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.500 187164 INFO nova.virt.libvirt.driver [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Deletion of /var/lib/nova/instances/2bf13a3e-bb2a-45f0-893e-0eb33fedb85e_del complete#033[00m
Dec  5 07:45:15 np0005546954 podman[209927]: 2025-12-05 12:45:15.511463545 +0000 UTC m=+0.048866449 container remove b918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.519 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5dc4d35d-b3a2-41b9-9e27-5ebdb30cf523]: (4, ('Fri Dec  5 12:45:15 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277 (b918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc)\nb918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc\nFri Dec  5 12:45:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277 (b918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc)\nb918bb8b297a48db8d4d39e3d1f420e37096597f6c6c4d3cfaed59f07b0915bc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.521 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ef2c18ca-1ae5-4399-bf30-67beb3ad7aa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.522 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee43e901-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.524 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:15 np0005546954 kernel: tapee43e901-b0: left promiscuous mode
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.550 187164 INFO nova.compute.manager [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.551 187164 DEBUG oslo.service.loopingcall [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.551 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.552 187164 DEBUG nova.compute.manager [-] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.553 187164 DEBUG nova.network.neutron [-] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.553 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d62565cb-2677-4baf-9d25-a0c8152e5d51]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.565 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[692e55f7-3002-4856-9dbe-a9997c78e86e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.567 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b5cdfacd-731a-4ee2-a8ad-67d86eb71657]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.579 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.579 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.581 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[6ebd2772-acf5-4cff-a3f7-1691c8d2f981]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358932, 'reachable_time': 38139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209945, 'error': None, 'target': 'ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:15 np0005546954 systemd[1]: run-netns-ovnmeta\x2dee43e901\x2db158\x2d4dc0\x2d894f\x2d2384aef8b277.mount: Deactivated successfully.
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.594 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee43e901-b158-4dc0-894f-2384aef8b277 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.595 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[ce0afd8a-c13c-465d-90e5-4f8fb8b7dfd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:45:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:15.596 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.946 187164 DEBUG nova.network.neutron [-] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:45:15 np0005546954 nova_compute[187160]: 2025-12-05 12:45:15.965 187164 INFO nova.compute.manager [-] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Took 0.41 seconds to deallocate network for instance.#033[00m
Dec  5 07:45:16 np0005546954 nova_compute[187160]: 2025-12-05 12:45:16.005 187164 DEBUG oslo_concurrency.lockutils [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:16 np0005546954 nova_compute[187160]: 2025-12-05 12:45:16.006 187164 DEBUG oslo_concurrency.lockutils [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:16 np0005546954 nova_compute[187160]: 2025-12-05 12:45:16.009 187164 DEBUG nova.compute.manager [req-3e21fd6f-f798-4d6e-b94f-ec8ff80cd1c2 req-14f3c15c-cd93-46e2-bdd3-6010a7f310f5 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Received event network-vif-deleted-f9c0965a-861e-4c24-9c97-679c6d706267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:16 np0005546954 nova_compute[187160]: 2025-12-05 12:45:16.057 187164 DEBUG nova.compute.provider_tree [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:45:16 np0005546954 nova_compute[187160]: 2025-12-05 12:45:16.069 187164 DEBUG nova.scheduler.client.report [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:45:16 np0005546954 nova_compute[187160]: 2025-12-05 12:45:16.088 187164 DEBUG oslo_concurrency.lockutils [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:16 np0005546954 nova_compute[187160]: 2025-12-05 12:45:16.118 187164 INFO nova.scheduler.client.report [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Deleted allocations for instance 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e#033[00m
Dec  5 07:45:16 np0005546954 nova_compute[187160]: 2025-12-05 12:45:16.198 187164 DEBUG oslo_concurrency.lockutils [None req-2dd415f1-283d-420d-b769-07c0d2a0512f 7ce7ef64754e4a32b4af3272e31a4a5e 6b5f383ed0484ca1bde081bf623dad4b - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:16.942 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:16.942 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:16.943 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:17 np0005546954 nova_compute[187160]: 2025-12-05 12:45:17.557 187164 DEBUG nova.compute.manager [req-d57f9141-232d-41b6-bc8e-8a1407aa3950 req-ac0fb12f-c61b-4566-85b6-cc663898761a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Received event network-vif-plugged-f9c0965a-861e-4c24-9c97-679c6d706267 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:45:17 np0005546954 nova_compute[187160]: 2025-12-05 12:45:17.558 187164 DEBUG oslo_concurrency.lockutils [req-d57f9141-232d-41b6-bc8e-8a1407aa3950 req-ac0fb12f-c61b-4566-85b6-cc663898761a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:17 np0005546954 nova_compute[187160]: 2025-12-05 12:45:17.558 187164 DEBUG oslo_concurrency.lockutils [req-d57f9141-232d-41b6-bc8e-8a1407aa3950 req-ac0fb12f-c61b-4566-85b6-cc663898761a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:17 np0005546954 nova_compute[187160]: 2025-12-05 12:45:17.558 187164 DEBUG oslo_concurrency.lockutils [req-d57f9141-232d-41b6-bc8e-8a1407aa3950 req-ac0fb12f-c61b-4566-85b6-cc663898761a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "2bf13a3e-bb2a-45f0-893e-0eb33fedb85e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:17 np0005546954 nova_compute[187160]: 2025-12-05 12:45:17.559 187164 DEBUG nova.compute.manager [req-d57f9141-232d-41b6-bc8e-8a1407aa3950 req-ac0fb12f-c61b-4566-85b6-cc663898761a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] No waiting events found dispatching network-vif-plugged-f9c0965a-861e-4c24-9c97-679c6d706267 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:45:17 np0005546954 nova_compute[187160]: 2025-12-05 12:45:17.559 187164 WARNING nova.compute.manager [req-d57f9141-232d-41b6-bc8e-8a1407aa3950 req-ac0fb12f-c61b-4566-85b6-cc663898761a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Received unexpected event network-vif-plugged-f9c0965a-861e-4c24-9c97-679c6d706267 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:45:19 np0005546954 nova_compute[187160]: 2025-12-05 12:45:19.067 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:45:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:45:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:45:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:45:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:45:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:45:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:45:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:45:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:45:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:45:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:45:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:45:19 np0005546954 nova_compute[187160]: 2025-12-05 12:45:19.566 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764938704.5642042, a133dad5-02c4-4021-90e5-ee9f3322f351 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:45:19 np0005546954 nova_compute[187160]: 2025-12-05 12:45:19.566 187164 INFO nova.compute.manager [-] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:45:19 np0005546954 nova_compute[187160]: 2025-12-05 12:45:19.587 187164 DEBUG nova.compute.manager [None req-6d5fa1d8-62a3-4c99-90d9-6280a0e47a2e - - - - - -] [instance: a133dad5-02c4-4021-90e5-ee9f3322f351] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:45:20 np0005546954 nova_compute[187160]: 2025-12-05 12:45:20.495 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:22 np0005546954 nova_compute[187160]: 2025-12-05 12:45:22.626 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:45:23.599 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:45:24 np0005546954 nova_compute[187160]: 2025-12-05 12:45:24.062 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764938709.0601888, bd45ae9f-9649-4347-a5e4-658d02804ef9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:45:24 np0005546954 nova_compute[187160]: 2025-12-05 12:45:24.063 187164 INFO nova.compute.manager [-] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:45:24 np0005546954 nova_compute[187160]: 2025-12-05 12:45:24.069 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:24 np0005546954 nova_compute[187160]: 2025-12-05 12:45:24.232 187164 DEBUG nova.compute.manager [None req-a8087657-8a91-41ba-ba56-83264a71d44a - - - - - -] [instance: bd45ae9f-9649-4347-a5e4-658d02804ef9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:45:25 np0005546954 nova_compute[187160]: 2025-12-05 12:45:25.498 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:25 np0005546954 podman[209948]: 2025-12-05 12:45:25.604467015 +0000 UTC m=+0.100724199 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  5 07:45:25 np0005546954 podman[209947]: 2025-12-05 12:45:25.615709946 +0000 UTC m=+0.113863600 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=)
Dec  5 07:45:25 np0005546954 nova_compute[187160]: 2025-12-05 12:45:25.797 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764938710.7964988, 0513a02c-7fe2-43aa-9bd6-020014460672 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:45:25 np0005546954 nova_compute[187160]: 2025-12-05 12:45:25.797 187164 INFO nova.compute.manager [-] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:45:25 np0005546954 nova_compute[187160]: 2025-12-05 12:45:25.820 187164 DEBUG nova.compute.manager [None req-57ee3ced-c736-4a1c-99ea-fa7df11e0102 - - - - - -] [instance: 0513a02c-7fe2-43aa-9bd6-020014460672] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:45:28 np0005546954 nova_compute[187160]: 2025-12-05 12:45:28.043 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764938713.0423315, dde775e5-862e-4f88-b0e9-7d98a681bb3e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:45:28 np0005546954 nova_compute[187160]: 2025-12-05 12:45:28.044 187164 INFO nova.compute.manager [-] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:45:28 np0005546954 nova_compute[187160]: 2025-12-05 12:45:28.064 187164 DEBUG nova.compute.manager [None req-f85abd63-2434-466c-903b-8639f851d24d - - - - - -] [instance: dde775e5-862e-4f88-b0e9-7d98a681bb3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:45:29 np0005546954 nova_compute[187160]: 2025-12-05 12:45:29.072 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:30 np0005546954 nova_compute[187160]: 2025-12-05 12:45:30.468 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764938715.4643612, 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:45:30 np0005546954 nova_compute[187160]: 2025-12-05 12:45:30.468 187164 INFO nova.compute.manager [-] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:45:30 np0005546954 nova_compute[187160]: 2025-12-05 12:45:30.490 187164 DEBUG nova.compute.manager [None req-6ca4a300-97c8-4f25-8c90-5184a9021e3e - - - - - -] [instance: 2bf13a3e-bb2a-45f0-893e-0eb33fedb85e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:45:30 np0005546954 nova_compute[187160]: 2025-12-05 12:45:30.503 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:34 np0005546954 nova_compute[187160]: 2025-12-05 12:45:34.075 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:35 np0005546954 nova_compute[187160]: 2025-12-05 12:45:35.538 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:35 np0005546954 podman[197513]: time="2025-12-05T12:45:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:45:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:45:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:45:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:45:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2580 "" "Go-http-client/1.1"
Dec  5 07:45:38 np0005546954 podman[209986]: 2025-12-05 12:45:38.550355895 +0000 UTC m=+0.063733943 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  5 07:45:39 np0005546954 nova_compute[187160]: 2025-12-05 12:45:39.077 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:40 np0005546954 nova_compute[187160]: 2025-12-05 12:45:40.541 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:43 np0005546954 podman[210006]: 2025-12-05 12:45:43.585553186 +0000 UTC m=+0.097115786 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:45:43 np0005546954 podman[210007]: 2025-12-05 12:45:43.613705736 +0000 UTC m=+0.111358042 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:45:44 np0005546954 nova_compute[187160]: 2025-12-05 12:45:44.091 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:45 np0005546954 nova_compute[187160]: 2025-12-05 12:45:45.543 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:49 np0005546954 nova_compute[187160]: 2025-12-05 12:45:49.134 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:45:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:45:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:45:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:45:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:45:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:45:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:45:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:45:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:45:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:45:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:45:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:45:50 np0005546954 nova_compute[187160]: 2025-12-05 12:45:50.546 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:51 np0005546954 nova_compute[187160]: 2025-12-05 12:45:51.108 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:45:52 np0005546954 nova_compute[187160]: 2025-12-05 12:45:52.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:45:52 np0005546954 nova_compute[187160]: 2025-12-05 12:45:52.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:45:52 np0005546954 nova_compute[187160]: 2025-12-05 12:45:52.095 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:45:54 np0005546954 nova_compute[187160]: 2025-12-05 12:45:54.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:45:54 np0005546954 nova_compute[187160]: 2025-12-05 12:45:54.136 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:55 np0005546954 nova_compute[187160]: 2025-12-05 12:45:55.549 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:56 np0005546954 nova_compute[187160]: 2025-12-05 12:45:56.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:45:56 np0005546954 nova_compute[187160]: 2025-12-05 12:45:56.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:45:56 np0005546954 nova_compute[187160]: 2025-12-05 12:45:56.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:45:56 np0005546954 podman[210054]: 2025-12-05 12:45:56.566381449 +0000 UTC m=+0.060167682 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:45:56 np0005546954 podman[210053]: 2025-12-05 12:45:56.582925996 +0000 UTC m=+0.090928923 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, 
managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.063 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.064 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.064 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.065 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.250 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.251 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5878MB free_disk=73.34025573730469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.251 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.252 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.306 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.306 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.325 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.351 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.379 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:45:57 np0005546954 nova_compute[187160]: 2025-12-05 12:45:57.380 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:45:58 np0005546954 ovn_controller[95566]: 2025-12-05T12:45:58Z|00066|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec  5 07:45:59 np0005546954 nova_compute[187160]: 2025-12-05 12:45:59.140 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:45:59 np0005546954 nova_compute[187160]: 2025-12-05 12:45:59.381 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:46:00 np0005546954 nova_compute[187160]: 2025-12-05 12:46:00.552 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:02 np0005546954 nova_compute[187160]: 2025-12-05 12:46:02.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:46:04 np0005546954 nova_compute[187160]: 2025-12-05 12:46:04.195 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:05 np0005546954 nova_compute[187160]: 2025-12-05 12:46:05.555 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:05 np0005546954 podman[197513]: time="2025-12-05T12:46:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:46:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:46:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:46:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:46:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2584 "" "Go-http-client/1.1"
Dec  5 07:46:06 np0005546954 nova_compute[187160]: 2025-12-05 12:46:06.730 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:46:06 np0005546954 nova_compute[187160]: 2025-12-05 12:46:06.730 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:46:06 np0005546954 nova_compute[187160]: 2025-12-05 12:46:06.751 187164 DEBUG nova.compute.manager [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:46:06 np0005546954 nova_compute[187160]: 2025-12-05 12:46:06.835 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:46:06 np0005546954 nova_compute[187160]: 2025-12-05 12:46:06.836 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:46:06 np0005546954 nova_compute[187160]: 2025-12-05 12:46:06.843 187164 DEBUG nova.virt.hardware [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:46:06 np0005546954 nova_compute[187160]: 2025-12-05 12:46:06.844 187164 INFO nova.compute.claims [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  5 07:46:06 np0005546954 nova_compute[187160]: 2025-12-05 12:46:06.983 187164 DEBUG nova.compute.provider_tree [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.003 187164 DEBUG nova.scheduler.client.report [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.026 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.027 187164 DEBUG nova.compute.manager [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.071 187164 DEBUG nova.compute.manager [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.072 187164 DEBUG nova.network.neutron [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.097 187164 INFO nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.128 187164 DEBUG nova.compute.manager [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.231 187164 DEBUG nova.compute.manager [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.233 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.233 187164 INFO nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Creating image(s)#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.234 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "/var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.234 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "/var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.235 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "/var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.250 187164 DEBUG oslo_concurrency.processutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.276 187164 DEBUG nova.policy [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a0f9474025d345019200ba286c9b5bf1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8f3230f111af4a7b989f52dd95d9d57f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.335 187164 DEBUG oslo_concurrency.processutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.337 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.338 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.353 187164 DEBUG oslo_concurrency.processutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.414 187164 DEBUG oslo_concurrency.processutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.416 187164 DEBUG oslo_concurrency.processutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.455 187164 DEBUG oslo_concurrency.processutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.457 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.458 187164 DEBUG oslo_concurrency.processutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.545 187164 DEBUG oslo_concurrency.processutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.546 187164 DEBUG nova.virt.disk.api [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Checking if we can resize image /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.546 187164 DEBUG oslo_concurrency.processutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.619 187164 DEBUG oslo_concurrency.processutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.620 187164 DEBUG nova.virt.disk.api [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Cannot resize image /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.620 187164 DEBUG nova.objects.instance [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lazy-loading 'migration_context' on Instance uuid c0f94a57-0be2-40c1-a4a8-5e04bbbb608d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.638 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.639 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Ensure instance console log exists: /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.640 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.640 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:46:07 np0005546954 nova_compute[187160]: 2025-12-05 12:46:07.640 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:46:09 np0005546954 nova_compute[187160]: 2025-12-05 12:46:09.198 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:09 np0005546954 podman[210111]: 2025-12-05 12:46:09.566631687 +0000 UTC m=+0.081660043 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:46:10 np0005546954 nova_compute[187160]: 2025-12-05 12:46:10.558 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:10 np0005546954 nova_compute[187160]: 2025-12-05 12:46:10.970 187164 DEBUG nova.network.neutron [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Successfully created port: ebe57519-501d-4a2e-be01-a8c24ea4af8f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:46:13 np0005546954 nova_compute[187160]: 2025-12-05 12:46:13.364 187164 DEBUG nova.network.neutron [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Successfully updated port: ebe57519-501d-4a2e-be01-a8c24ea4af8f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:46:13 np0005546954 nova_compute[187160]: 2025-12-05 12:46:13.387 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "refresh_cache-c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:46:13 np0005546954 nova_compute[187160]: 2025-12-05 12:46:13.387 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquired lock "refresh_cache-c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:46:13 np0005546954 nova_compute[187160]: 2025-12-05 12:46:13.388 187164 DEBUG nova.network.neutron [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:46:13 np0005546954 nova_compute[187160]: 2025-12-05 12:46:13.468 187164 DEBUG nova.compute.manager [req-4f03f3a1-904e-4a48-baa7-99fda04a527f req-a731d0ba-fc65-4916-a7da-0eb26ccca81c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Received event network-changed-ebe57519-501d-4a2e-be01-a8c24ea4af8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:46:13 np0005546954 nova_compute[187160]: 2025-12-05 12:46:13.469 187164 DEBUG nova.compute.manager [req-4f03f3a1-904e-4a48-baa7-99fda04a527f req-a731d0ba-fc65-4916-a7da-0eb26ccca81c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Refreshing instance network info cache due to event network-changed-ebe57519-501d-4a2e-be01-a8c24ea4af8f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:46:13 np0005546954 nova_compute[187160]: 2025-12-05 12:46:13.469 187164 DEBUG oslo_concurrency.lockutils [req-4f03f3a1-904e-4a48-baa7-99fda04a527f req-a731d0ba-fc65-4916-a7da-0eb26ccca81c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:46:14 np0005546954 nova_compute[187160]: 2025-12-05 12:46:14.200 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:14 np0005546954 nova_compute[187160]: 2025-12-05 12:46:14.363 187164 DEBUG nova.network.neutron [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:46:14 np0005546954 podman[210131]: 2025-12-05 12:46:14.548590384 +0000 UTC m=+0.055580839 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:46:14 np0005546954 podman[210130]: 2025-12-05 12:46:14.599287568 +0000 UTC m=+0.104696624 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.609 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.803 187164 DEBUG nova.network.neutron [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Updating instance_info_cache with network_info: [{"id": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "address": "fa:16:3e:4c:86:d3", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe57519-50", "ovs_interfaceid": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.823 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Releasing lock "refresh_cache-c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.823 187164 DEBUG nova.compute.manager [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Instance network_info: |[{"id": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "address": "fa:16:3e:4c:86:d3", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe57519-50", "ovs_interfaceid": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.824 187164 DEBUG oslo_concurrency.lockutils [req-4f03f3a1-904e-4a48-baa7-99fda04a527f req-a731d0ba-fc65-4916-a7da-0eb26ccca81c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.824 187164 DEBUG nova.network.neutron [req-4f03f3a1-904e-4a48-baa7-99fda04a527f req-a731d0ba-fc65-4916-a7da-0eb26ccca81c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Refreshing network info cache for port ebe57519-501d-4a2e-be01-a8c24ea4af8f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.828 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Start _get_guest_xml network_info=[{"id": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "address": "fa:16:3e:4c:86:d3", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe57519-50", "ovs_interfaceid": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.834 187164 WARNING nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.843 187164 DEBUG nova.virt.libvirt.host [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.844 187164 DEBUG nova.virt.libvirt.host [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.850 187164 DEBUG nova.virt.libvirt.host [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.851 187164 DEBUG nova.virt.libvirt.host [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.852 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.853 187164 DEBUG nova.virt.hardware [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.853 187164 DEBUG nova.virt.hardware [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.853 187164 DEBUG nova.virt.hardware [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.853 187164 DEBUG nova.virt.hardware [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.854 187164 DEBUG nova.virt.hardware [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.854 187164 DEBUG nova.virt.hardware [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.854 187164 DEBUG nova.virt.hardware [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.854 187164 DEBUG nova.virt.hardware [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.855 187164 DEBUG nova.virt.hardware [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.855 187164 DEBUG nova.virt.hardware [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.855 187164 DEBUG nova.virt.hardware [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.859 187164 DEBUG nova.virt.libvirt.vif [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:46:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-92849756',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-92849756',id=8,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f3230f111af4a7b989f52dd95d9d57f',ramdisk_id='',reservation_id='r-495evnna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-2049426111',owner_user_name='tempest-TestExecuteBasicStrategy-2049426111-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:46:07Z,user_data=None,user_id='a0f9474025d345019200ba286c9b5bf1',uuid=c0f94a57-0be2-40c1-a4a8-5e04bbbb608d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "address": "fa:16:3e:4c:86:d3", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe57519-50", "ovs_interfaceid": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.859 187164 DEBUG nova.network.os_vif_util [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Converting VIF {"id": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "address": "fa:16:3e:4c:86:d3", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe57519-50", "ovs_interfaceid": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.860 187164 DEBUG nova.network.os_vif_util [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:86:d3,bridge_name='br-int',has_traffic_filtering=True,id=ebe57519-501d-4a2e-be01-a8c24ea4af8f,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe57519-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.861 187164 DEBUG nova.objects.instance [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lazy-loading 'pci_devices' on Instance uuid c0f94a57-0be2-40c1-a4a8-5e04bbbb608d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.876 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  <uuid>c0f94a57-0be2-40c1-a4a8-5e04bbbb608d</uuid>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  <name>instance-00000008</name>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteBasicStrategy-server-92849756</nova:name>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 12:46:15</nova:creationTime>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:        <nova:user uuid="a0f9474025d345019200ba286c9b5bf1">tempest-TestExecuteBasicStrategy-2049426111-project-member</nova:user>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:        <nova:project uuid="8f3230f111af4a7b989f52dd95d9d57f">tempest-TestExecuteBasicStrategy-2049426111</nova:project>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:        <nova:port uuid="ebe57519-501d-4a2e-be01-a8c24ea4af8f">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <system>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <entry name="serial">c0f94a57-0be2-40c1-a4a8-5e04bbbb608d</entry>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <entry name="uuid">c0f94a57-0be2-40c1-a4a8-5e04bbbb608d</entry>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    </system>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  <os>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  </clock>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk.config"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:4c:86:d3"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <target dev="tapebe57519-50"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/console.log" append="off"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    </serial>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <video>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 07:46:15 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 07:46:15 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:46:15 np0005546954 nova_compute[187160]: </domain>
Dec  5 07:46:15 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.878 187164 DEBUG nova.compute.manager [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Preparing to wait for external event network-vif-plugged-ebe57519-501d-4a2e-be01-a8c24ea4af8f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.878 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.878 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.878 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.879 187164 DEBUG nova.virt.libvirt.vif [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:46:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-92849756',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-92849756',id=8,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f3230f111af4a7b989f52dd95d9d57f',ramdisk_id='',reservation_id='r-495evnna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-2049426111',owner_user_name='tempest-TestExecuteBasicStrategy-2049426111-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:46:07Z,user_data=None,user_id='a0f9474025d345019200ba286c9b5bf1',uuid=c0f94a57-0be2-40c1-a4a8-5e04bbbb608d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "address": "fa:16:3e:4c:86:d3", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe57519-50", "ovs_interfaceid": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.880 187164 DEBUG nova.network.os_vif_util [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Converting VIF {"id": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "address": "fa:16:3e:4c:86:d3", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe57519-50", "ovs_interfaceid": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.881 187164 DEBUG nova.network.os_vif_util [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:86:d3,bridge_name='br-int',has_traffic_filtering=True,id=ebe57519-501d-4a2e-be01-a8c24ea4af8f,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe57519-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.881 187164 DEBUG os_vif [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:86:d3,bridge_name='br-int',has_traffic_filtering=True,id=ebe57519-501d-4a2e-be01-a8c24ea4af8f,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe57519-50') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.882 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.882 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.883 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.886 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.886 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebe57519-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.887 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapebe57519-50, col_values=(('external_ids', {'iface-id': 'ebe57519-501d-4a2e-be01-a8c24ea4af8f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4c:86:d3', 'vm-uuid': 'c0f94a57-0be2-40c1-a4a8-5e04bbbb608d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.889 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:15 np0005546954 NetworkManager[55665]: <info>  [1764938775.8910] manager: (tapebe57519-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.896 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:46:15 np0005546954 nova_compute[187160]: 2025-12-05 12:46:15.897 187164 INFO os_vif [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:86:d3,bridge_name='br-int',has_traffic_filtering=True,id=ebe57519-501d-4a2e-be01-a8c24ea4af8f,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe57519-50')#033[00m
Dec  5 07:46:16 np0005546954 nova_compute[187160]: 2025-12-05 12:46:16.095 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:46:16 np0005546954 nova_compute[187160]: 2025-12-05 12:46:16.096 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:46:16 np0005546954 nova_compute[187160]: 2025-12-05 12:46:16.096 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] No VIF found with MAC fa:16:3e:4c:86:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:46:16 np0005546954 nova_compute[187160]: 2025-12-05 12:46:16.097 187164 INFO nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Using config drive#033[00m
Dec  5 07:46:16 np0005546954 nova_compute[187160]: 2025-12-05 12:46:16.616 187164 INFO nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Creating config drive at /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk.config#033[00m
Dec  5 07:46:16 np0005546954 nova_compute[187160]: 2025-12-05 12:46:16.622 187164 DEBUG oslo_concurrency.processutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwgv4w7p1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:46:16 np0005546954 nova_compute[187160]: 2025-12-05 12:46:16.750 187164 DEBUG oslo_concurrency.processutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwgv4w7p1" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:46:16 np0005546954 kernel: tapebe57519-50: entered promiscuous mode
Dec  5 07:46:16 np0005546954 NetworkManager[55665]: <info>  [1764938776.8279] manager: (tapebe57519-50): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Dec  5 07:46:16 np0005546954 ovn_controller[95566]: 2025-12-05T12:46:16Z|00067|binding|INFO|Claiming lport ebe57519-501d-4a2e-be01-a8c24ea4af8f for this chassis.
Dec  5 07:46:16 np0005546954 ovn_controller[95566]: 2025-12-05T12:46:16Z|00068|binding|INFO|ebe57519-501d-4a2e-be01-a8c24ea4af8f: Claiming fa:16:3e:4c:86:d3 10.100.0.10
Dec  5 07:46:16 np0005546954 nova_compute[187160]: 2025-12-05 12:46:16.830 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:16 np0005546954 nova_compute[187160]: 2025-12-05 12:46:16.833 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.848 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:86:d3 10.100.0.10'], port_security=['fa:16:3e:4c:86:d3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c0f94a57-0be2-40c1-a4a8-5e04bbbb608d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2440340b-a243-4eda-a15d-71178a169a03', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3230f111af4a7b989f52dd95d9d57f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8f34e8a7-cd16-49df-a810-0667f963d3a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b311c115-047d-4871-8652-57d6506f66be, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=ebe57519-501d-4a2e-be01-a8c24ea4af8f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.851 104428 INFO neutron.agent.ovn.metadata.agent [-] Port ebe57519-501d-4a2e-be01-a8c24ea4af8f in datapath 2440340b-a243-4eda-a15d-71178a169a03 bound to our chassis#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.853 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2440340b-a243-4eda-a15d-71178a169a03#033[00m
Dec  5 07:46:16 np0005546954 systemd-udevd[210195]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:46:16 np0005546954 NetworkManager[55665]: <info>  [1764938776.8691] device (tapebe57519-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:46:16 np0005546954 NetworkManager[55665]: <info>  [1764938776.8706] device (tapebe57519-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.873 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[da0e5a2f-afef-41f1-9640-99d92ce4963a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.874 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2440340b-a1 in ovnmeta-2440340b-a243-4eda-a15d-71178a169a03 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.876 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2440340b-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.876 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[fc4f3014-1fba-41e9-9c93-815e409c9a7a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.877 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[88dfd09f-eaaa-4571-b830-f78dc20d1d91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:16 np0005546954 systemd-machined[153497]: New machine qemu-6-instance-00000008.
Dec  5 07:46:16 np0005546954 nova_compute[187160]: 2025-12-05 12:46:16.887 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:16 np0005546954 ovn_controller[95566]: 2025-12-05T12:46:16Z|00069|binding|INFO|Setting lport ebe57519-501d-4a2e-be01-a8c24ea4af8f ovn-installed in OVS
Dec  5 07:46:16 np0005546954 ovn_controller[95566]: 2025-12-05T12:46:16Z|00070|binding|INFO|Setting lport ebe57519-501d-4a2e-be01-a8c24ea4af8f up in Southbound
Dec  5 07:46:16 np0005546954 nova_compute[187160]: 2025-12-05 12:46:16.894 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.894 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[5ae3c798-3cc1-4de0-bca2-794b371b04af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:16 np0005546954 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.920 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[1f4b0e12-9aee-4bb6-ad03-76644dde3ba1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.942 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.943 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.943 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.955 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[66b6f256-453c-40ff-a39e-257b388c5d21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:16.963 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[41f4bb15-649c-4cf0-b582-953e5d06b3c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:16 np0005546954 NetworkManager[55665]: <info>  [1764938776.9652] manager: (tap2440340b-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.010 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[cd99d583-a0a9-49c1-b077-2d6d4b337d90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.014 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[c4dfa037-73da-4d74-9c5f-1b4b80cbd9c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:17 np0005546954 NetworkManager[55665]: <info>  [1764938777.0383] device (tap2440340b-a0): carrier: link connected
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.045 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[6d18b1fd-a015-4080-98e4-a66c71bb63aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.064 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5a5e22ab-a610-4304-9924-fb42c9687d37]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2440340b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:f7:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 378366, 'reachable_time': 21487, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210231, 'error': None, 'target': 'ovnmeta-2440340b-a243-4eda-a15d-71178a169a03', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.081 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c74f6acc-ea10-4976-9846-6a8cb6d52bd7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4d:f782'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378366, 'tstamp': 378366}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210232, 'error': None, 'target': 'ovnmeta-2440340b-a243-4eda-a15d-71178a169a03', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.099 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[2814eaa4-7806-402a-9036-0a370517560a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2440340b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:f7:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 378366, 'reachable_time': 21487, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210233, 'error': None, 'target': 'ovnmeta-2440340b-a243-4eda-a15d-71178a169a03', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.133 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e699a45b-a562-4390-a5bc-bfddcd681a63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.198 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[07ae5300-399c-421a-99f1-d41511662ea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.200 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2440340b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.201 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.201 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2440340b-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:46:17 np0005546954 NetworkManager[55665]: <info>  [1764938777.2043] manager: (tap2440340b-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Dec  5 07:46:17 np0005546954 kernel: tap2440340b-a0: entered promiscuous mode
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.205 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.207 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2440340b-a0, col_values=(('external_ids', {'iface-id': '6804ecdc-7520-4ec9-b54b-c1c436be63da'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.208 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:17 np0005546954 ovn_controller[95566]: 2025-12-05T12:46:17Z|00071|binding|INFO|Releasing lport 6804ecdc-7520-4ec9-b54b-c1c436be63da from this chassis (sb_readonly=0)
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.233 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.234 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2440340b-a243-4eda-a15d-71178a169a03.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2440340b-a243-4eda-a15d-71178a169a03.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.236 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[53dadb14-e2a8-44af-b267-46650b30d0ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.237 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-2440340b-a243-4eda-a15d-71178a169a03
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/2440340b-a243-4eda-a15d-71178a169a03.pid.haproxy
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID 2440340b-a243-4eda-a15d-71178a169a03
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:46:17 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:17.239 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2440340b-a243-4eda-a15d-71178a169a03', 'env', 'PROCESS_TAG=haproxy-2440340b-a243-4eda-a15d-71178a169a03', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2440340b-a243-4eda-a15d-71178a169a03.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:46:17 np0005546954 podman[210263]: 2025-12-05 12:46:17.627085555 +0000 UTC m=+0.072396353 container create c3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  5 07:46:17 np0005546954 systemd[1]: Started libpod-conmon-c3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff.scope.
Dec  5 07:46:17 np0005546954 podman[210263]: 2025-12-05 12:46:17.580578392 +0000 UTC m=+0.025889220 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.684 187164 DEBUG nova.compute.manager [req-225d971d-7d77-4a39-ba97-3c7eaa64d4f5 req-d3cc2300-aaac-4931-a299-42032fa8fdef 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Received event network-vif-plugged-ebe57519-501d-4a2e-be01-a8c24ea4af8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.684 187164 DEBUG oslo_concurrency.lockutils [req-225d971d-7d77-4a39-ba97-3c7eaa64d4f5 req-d3cc2300-aaac-4931-a299-42032fa8fdef 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.685 187164 DEBUG oslo_concurrency.lockutils [req-225d971d-7d77-4a39-ba97-3c7eaa64d4f5 req-d3cc2300-aaac-4931-a299-42032fa8fdef 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.685 187164 DEBUG oslo_concurrency.lockutils [req-225d971d-7d77-4a39-ba97-3c7eaa64d4f5 req-d3cc2300-aaac-4931-a299-42032fa8fdef 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.685 187164 DEBUG nova.compute.manager [req-225d971d-7d77-4a39-ba97-3c7eaa64d4f5 req-d3cc2300-aaac-4931-a299-42032fa8fdef 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Processing event network-vif-plugged-ebe57519-501d-4a2e-be01-a8c24ea4af8f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:46:17 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:46:17 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5bb38938fb4079d52dc35ec20d1a5122f5d6a17b332a4128ca4a0179d79a4200/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:46:17 np0005546954 podman[210263]: 2025-12-05 12:46:17.728914919 +0000 UTC m=+0.174225727 container init c3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.728 187164 DEBUG nova.compute.manager [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.729 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938777.7277966, c0f94a57-0be2-40c1-a4a8-5e04bbbb608d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.730 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] VM Started (Lifecycle Event)#033[00m
Dec  5 07:46:17 np0005546954 podman[210263]: 2025-12-05 12:46:17.734856985 +0000 UTC m=+0.180167783 container start c3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.734 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.739 187164 INFO nova.virt.libvirt.driver [-] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Instance spawned successfully.#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.740 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.744 187164 DEBUG nova.network.neutron [req-4f03f3a1-904e-4a48-baa7-99fda04a527f req-a731d0ba-fc65-4916-a7da-0eb26ccca81c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Updated VIF entry in instance network info cache for port ebe57519-501d-4a2e-be01-a8c24ea4af8f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.745 187164 DEBUG nova.network.neutron [req-4f03f3a1-904e-4a48-baa7-99fda04a527f req-a731d0ba-fc65-4916-a7da-0eb26ccca81c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Updating instance_info_cache with network_info: [{"id": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "address": "fa:16:3e:4c:86:d3", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe57519-50", "ovs_interfaceid": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:46:17 np0005546954 neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03[210285]: [NOTICE]   (210289) : New worker (210291) forked
Dec  5 07:46:17 np0005546954 neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03[210285]: [NOTICE]   (210289) : Loading success.
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.776 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.786 187164 DEBUG oslo_concurrency.lockutils [req-4f03f3a1-904e-4a48-baa7-99fda04a527f req-a731d0ba-fc65-4916-a7da-0eb26ccca81c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.789 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.797 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.798 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.799 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.800 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.800 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:46:17 np0005546954 nova_compute[187160]: 2025-12-05 12:46:17.801 187164 DEBUG nova.virt.libvirt.driver [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.061 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.062 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938777.7303243, c0f94a57-0be2-40c1-a4a8-5e04bbbb608d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.062 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.087 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.090 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938777.7333457, c0f94a57-0be2-40c1-a4a8-5e04bbbb608d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.090 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.096 187164 INFO nova.compute.manager [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Took 10.86 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.097 187164 DEBUG nova.compute.manager [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.107 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.110 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.137 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.164 187164 INFO nova.compute.manager [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Took 11.35 seconds to build instance.#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.184 187164 DEBUG oslo_concurrency.lockutils [None req-39ccfe9b-0a03-44ba-b74f-d31d4e078975 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:46:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:18.420 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:46:18 np0005546954 nova_compute[187160]: 2025-12-05 12:46:18.422 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:18 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:18.426 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:46:19 np0005546954 nova_compute[187160]: 2025-12-05 12:46:19.203 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:46:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:46:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:46:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:46:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:46:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:46:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:46:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:46:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:46:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:46:19 np0005546954 nova_compute[187160]: 2025-12-05 12:46:19.795 187164 DEBUG nova.compute.manager [req-025a6be7-be98-40d9-ac38-f1349c4144ee req-6e5cf1a5-e213-44db-af60-7e64692208c8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Received event network-vif-plugged-ebe57519-501d-4a2e-be01-a8c24ea4af8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:46:19 np0005546954 nova_compute[187160]: 2025-12-05 12:46:19.796 187164 DEBUG oslo_concurrency.lockutils [req-025a6be7-be98-40d9-ac38-f1349c4144ee req-6e5cf1a5-e213-44db-af60-7e64692208c8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:46:19 np0005546954 nova_compute[187160]: 2025-12-05 12:46:19.796 187164 DEBUG oslo_concurrency.lockutils [req-025a6be7-be98-40d9-ac38-f1349c4144ee req-6e5cf1a5-e213-44db-af60-7e64692208c8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:46:19 np0005546954 nova_compute[187160]: 2025-12-05 12:46:19.796 187164 DEBUG oslo_concurrency.lockutils [req-025a6be7-be98-40d9-ac38-f1349c4144ee req-6e5cf1a5-e213-44db-af60-7e64692208c8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:46:19 np0005546954 nova_compute[187160]: 2025-12-05 12:46:19.797 187164 DEBUG nova.compute.manager [req-025a6be7-be98-40d9-ac38-f1349c4144ee req-6e5cf1a5-e213-44db-af60-7e64692208c8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] No waiting events found dispatching network-vif-plugged-ebe57519-501d-4a2e-be01-a8c24ea4af8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:46:19 np0005546954 nova_compute[187160]: 2025-12-05 12:46:19.797 187164 WARNING nova.compute.manager [req-025a6be7-be98-40d9-ac38-f1349c4144ee req-6e5cf1a5-e213-44db-af60-7e64692208c8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Received unexpected event network-vif-plugged-ebe57519-501d-4a2e-be01-a8c24ea4af8f for instance with vm_state active and task_state None.#033[00m
Dec  5 07:46:20 np0005546954 nova_compute[187160]: 2025-12-05 12:46:20.891 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:24 np0005546954 nova_compute[187160]: 2025-12-05 12:46:24.206 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:46:24.433 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:46:25 np0005546954 nova_compute[187160]: 2025-12-05 12:46:25.895 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:27 np0005546954 podman[210301]: 2025-12-05 12:46:27.587664657 +0000 UTC m=+0.087421173 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:46:27 np0005546954 podman[210300]: 2025-12-05 12:46:27.595339458 +0000 UTC m=+0.094045722 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-type=git, version=9.6, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec  5 07:46:29 np0005546954 nova_compute[187160]: 2025-12-05 12:46:29.209 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:29 np0005546954 ovn_controller[95566]: 2025-12-05T12:46:29Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4c:86:d3 10.100.0.10
Dec  5 07:46:29 np0005546954 ovn_controller[95566]: 2025-12-05T12:46:29Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4c:86:d3 10.100.0.10
Dec  5 07:46:30 np0005546954 nova_compute[187160]: 2025-12-05 12:46:30.899 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:34 np0005546954 nova_compute[187160]: 2025-12-05 12:46:34.212 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:35 np0005546954 podman[197513]: time="2025-12-05T12:46:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:46:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:46:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 07:46:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:46:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3049 "" "Go-http-client/1.1"
Dec  5 07:46:35 np0005546954 nova_compute[187160]: 2025-12-05 12:46:35.905 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:39 np0005546954 nova_compute[187160]: 2025-12-05 12:46:39.213 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:40 np0005546954 podman[210357]: 2025-12-05 12:46:40.55072828 +0000 UTC m=+0.060709955 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 07:46:40 np0005546954 nova_compute[187160]: 2025-12-05 12:46:40.909 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:44 np0005546954 nova_compute[187160]: 2025-12-05 12:46:44.216 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:44 np0005546954 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  5 07:46:44 np0005546954 podman[210381]: 2025-12-05 12:46:44.983087561 +0000 UTC m=+0.059905140 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:46:45 np0005546954 podman[210380]: 2025-12-05 12:46:45.038441318 +0000 UTC m=+0.121936216 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  5 07:46:45 np0005546954 nova_compute[187160]: 2025-12-05 12:46:45.913 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:47 np0005546954 ovn_controller[95566]: 2025-12-05T12:46:47Z|00072|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec  5 07:46:49 np0005546954 nova_compute[187160]: 2025-12-05 12:46:49.219 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:46:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:46:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:46:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:46:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:46:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:46:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:46:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:46:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:46:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:46:50 np0005546954 nova_compute[187160]: 2025-12-05 12:46:50.916 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:52 np0005546954 nova_compute[187160]: 2025-12-05 12:46:52.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:46:52 np0005546954 nova_compute[187160]: 2025-12-05 12:46:52.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:46:52 np0005546954 nova_compute[187160]: 2025-12-05 12:46:52.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:46:52 np0005546954 nova_compute[187160]: 2025-12-05 12:46:52.365 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:46:52 np0005546954 nova_compute[187160]: 2025-12-05 12:46:52.366 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:46:52 np0005546954 nova_compute[187160]: 2025-12-05 12:46:52.366 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:46:52 np0005546954 nova_compute[187160]: 2025-12-05 12:46:52.367 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c0f94a57-0be2-40c1-a4a8-5e04bbbb608d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:46:53 np0005546954 nova_compute[187160]: 2025-12-05 12:46:53.805 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Updating instance_info_cache with network_info: [{"id": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "address": "fa:16:3e:4c:86:d3", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe57519-50", "ovs_interfaceid": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:46:54 np0005546954 nova_compute[187160]: 2025-12-05 12:46:54.019 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:46:54 np0005546954 nova_compute[187160]: 2025-12-05 12:46:54.020 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:46:54 np0005546954 nova_compute[187160]: 2025-12-05 12:46:54.021 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:46:54 np0005546954 nova_compute[187160]: 2025-12-05 12:46:54.223 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:55 np0005546954 nova_compute[187160]: 2025-12-05 12:46:55.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:46:55 np0005546954 nova_compute[187160]: 2025-12-05 12:46:55.067 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:46:55 np0005546954 nova_compute[187160]: 2025-12-05 12:46:55.920 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:56 np0005546954 nova_compute[187160]: 2025-12-05 12:46:56.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:46:57 np0005546954 nova_compute[187160]: 2025-12-05 12:46:57.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:46:57 np0005546954 nova_compute[187160]: 2025-12-05 12:46:57.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.070 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.070 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.071 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.071 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.151 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:46:58 np0005546954 podman[210432]: 2025-12-05 12:46:58.196808786 +0000 UTC m=+0.068646295 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Dec  5 07:46:58 np0005546954 podman[210430]: 2025-12-05 12:46:58.199267403 +0000 UTC m=+0.072803375 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm)
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.227 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.229 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.291 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.464 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.465 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5663MB free_disk=73.31147003173828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.466 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.466 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.540 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance c0f94a57-0be2-40c1-a4a8-5e04bbbb608d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.541 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.541 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.588 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.604 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.663 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:46:58 np0005546954 nova_compute[187160]: 2025-12-05 12:46:58.663 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:46:59 np0005546954 nova_compute[187160]: 2025-12-05 12:46:59.226 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:46:59 np0005546954 nova_compute[187160]: 2025-12-05 12:46:59.660 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:47:00 np0005546954 nova_compute[187160]: 2025-12-05 12:47:00.923 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:01 np0005546954 nova_compute[187160]: 2025-12-05 12:47:01.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:47:02 np0005546954 nova_compute[187160]: 2025-12-05 12:47:02.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:47:04 np0005546954 nova_compute[187160]: 2025-12-05 12:47:04.229 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:04 np0005546954 nova_compute[187160]: 2025-12-05 12:47:04.753 187164 DEBUG nova.virt.libvirt.driver [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Creating tmpfile /var/lib/nova/instances/tmptjoidb5d to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  5 07:47:04 np0005546954 nova_compute[187160]: 2025-12-05 12:47:04.754 187164 DEBUG nova.compute.manager [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptjoidb5d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  5 07:47:05 np0005546954 podman[197513]: time="2025-12-05T12:47:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:47:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:47:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 07:47:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:47:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3042 "" "Go-http-client/1.1"
Dec  5 07:47:05 np0005546954 nova_compute[187160]: 2025-12-05 12:47:05.903 187164 DEBUG nova.compute.manager [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptjoidb5d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4ace353f-ec30-46cb-9906-7b66b0f752a6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  5 07:47:05 np0005546954 nova_compute[187160]: 2025-12-05 12:47:05.928 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:05 np0005546954 nova_compute[187160]: 2025-12-05 12:47:05.932 187164 DEBUG oslo_concurrency.lockutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-4ace353f-ec30-46cb-9906-7b66b0f752a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:47:05 np0005546954 nova_compute[187160]: 2025-12-05 12:47:05.932 187164 DEBUG oslo_concurrency.lockutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-4ace353f-ec30-46cb-9906-7b66b0f752a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:47:05 np0005546954 nova_compute[187160]: 2025-12-05 12:47:05.932 187164 DEBUG nova.network.neutron [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.389 187164 DEBUG nova.network.neutron [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Updating instance_info_cache with network_info: [{"id": "3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2", "address": "fa:16:3e:eb:b8:79", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cfce5e6-e2", "ovs_interfaceid": "3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.412 187164 DEBUG oslo_concurrency.lockutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-4ace353f-ec30-46cb-9906-7b66b0f752a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.415 187164 DEBUG nova.virt.libvirt.driver [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptjoidb5d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4ace353f-ec30-46cb-9906-7b66b0f752a6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.416 187164 DEBUG nova.virt.libvirt.driver [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Creating instance directory: /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.417 187164 DEBUG nova.virt.libvirt.driver [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Creating disk.info with the contents: {'/var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk': 'qcow2', '/var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.418 187164 DEBUG nova.virt.libvirt.driver [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.419 187164 DEBUG nova.objects.instance [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4ace353f-ec30-46cb-9906-7b66b0f752a6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.459 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.521 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.522 187164 DEBUG oslo_concurrency.lockutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.523 187164 DEBUG oslo_concurrency.lockutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.548 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.611 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.612 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.646 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.648 187164 DEBUG oslo_concurrency.lockutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.648 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.708 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.709 187164 DEBUG nova.virt.disk.api [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Checking if we can resize image /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.710 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.771 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.772 187164 DEBUG nova.virt.disk.api [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Cannot resize image /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.773 187164 DEBUG nova.objects.instance [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'migration_context' on Instance uuid 4ace353f-ec30-46cb-9906-7b66b0f752a6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.788 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.821 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk.config 485376" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.823 187164 DEBUG nova.virt.libvirt.volume.remotefs [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk.config to /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  5 07:47:07 np0005546954 nova_compute[187160]: 2025-12-05 12:47:07.824 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk.config /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.244 187164 DEBUG oslo_concurrency.processutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6/disk.config /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.245 187164 DEBUG nova.virt.libvirt.driver [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.246 187164 DEBUG nova.virt.libvirt.vif [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:45:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1530812500',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1530812500',id=7,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:46:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8f3230f111af4a7b989f52dd95d9d57f',ramdisk_id='',reservation_id='r-7gjdsrre',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_
ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-2049426111',owner_user_name='tempest-TestExecuteBasicStrategy-2049426111-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:46:00Z,user_data=None,user_id='a0f9474025d345019200ba286c9b5bf1',uuid=4ace353f-ec30-46cb-9906-7b66b0f752a6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2", "address": "fa:16:3e:eb:b8:79", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3cfce5e6-e2", "ovs_interfaceid": "3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.247 187164 DEBUG nova.network.os_vif_util [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2", "address": "fa:16:3e:eb:b8:79", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3cfce5e6-e2", "ovs_interfaceid": "3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.248 187164 DEBUG nova.network.os_vif_util [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:eb:b8:79,bridge_name='br-int',has_traffic_filtering=True,id=3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cfce5e6-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.248 187164 DEBUG os_vif [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:b8:79,bridge_name='br-int',has_traffic_filtering=True,id=3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cfce5e6-e2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.249 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.250 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.250 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.257 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.257 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3cfce5e6-e2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.258 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3cfce5e6-e2, col_values=(('external_ids', {'iface-id': '3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:b8:79', 'vm-uuid': '4ace353f-ec30-46cb-9906-7b66b0f752a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.289 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:08 np0005546954 NetworkManager[55665]: <info>  [1764938828.2908] manager: (tap3cfce5e6-e2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.293 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.303 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.305 187164 INFO os_vif [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:b8:79,bridge_name='br-int',has_traffic_filtering=True,id=3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cfce5e6-e2')#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.306 187164 DEBUG nova.virt.libvirt.driver [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  5 07:47:08 np0005546954 nova_compute[187160]: 2025-12-05 12:47:08.306 187164 DEBUG nova.compute.manager [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptjoidb5d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4ace353f-ec30-46cb-9906-7b66b0f752a6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  5 07:47:09 np0005546954 nova_compute[187160]: 2025-12-05 12:47:09.230 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:11 np0005546954 nova_compute[187160]: 2025-12-05 12:47:11.244 187164 DEBUG nova.network.neutron [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Port 3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  5 07:47:11 np0005546954 nova_compute[187160]: 2025-12-05 12:47:11.247 187164 DEBUG nova.compute.manager [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptjoidb5d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4ace353f-ec30-46cb-9906-7b66b0f752a6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  5 07:47:11 np0005546954 systemd[1]: Starting libvirt proxy daemon...
Dec  5 07:47:11 np0005546954 systemd[1]: Started libvirt proxy daemon.
Dec  5 07:47:11 np0005546954 podman[210497]: 2025-12-05 12:47:11.93102745 +0000 UTC m=+0.085323718 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true)
Dec  5 07:47:12 np0005546954 kernel: tap3cfce5e6-e2: entered promiscuous mode
Dec  5 07:47:12 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:12Z|00073|binding|INFO|Claiming lport 3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 for this additional chassis.
Dec  5 07:47:12 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:12Z|00074|binding|INFO|3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2: Claiming fa:16:3e:eb:b8:79 10.100.0.7
Dec  5 07:47:12 np0005546954 nova_compute[187160]: 2025-12-05 12:47:12.016 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:12 np0005546954 NetworkManager[55665]: <info>  [1764938832.0196] manager: (tap3cfce5e6-e2): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Dec  5 07:47:12 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:12Z|00075|binding|INFO|Setting lport 3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 ovn-installed in OVS
Dec  5 07:47:12 np0005546954 nova_compute[187160]: 2025-12-05 12:47:12.034 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:12 np0005546954 nova_compute[187160]: 2025-12-05 12:47:12.040 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:12 np0005546954 systemd-udevd[210546]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:47:12 np0005546954 systemd-machined[153497]: New machine qemu-7-instance-00000007.
Dec  5 07:47:12 np0005546954 NetworkManager[55665]: <info>  [1764938832.0677] device (tap3cfce5e6-e2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:47:12 np0005546954 NetworkManager[55665]: <info>  [1764938832.0686] device (tap3cfce5e6-e2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:47:12 np0005546954 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Dec  5 07:47:12 np0005546954 nova_compute[187160]: 2025-12-05 12:47:12.666 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938832.6653578, 4ace353f-ec30-46cb-9906-7b66b0f752a6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:47:12 np0005546954 nova_compute[187160]: 2025-12-05 12:47:12.666 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] VM Started (Lifecycle Event)#033[00m
Dec  5 07:47:12 np0005546954 nova_compute[187160]: 2025-12-05 12:47:12.766 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:47:13 np0005546954 nova_compute[187160]: 2025-12-05 12:47:13.290 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:13 np0005546954 nova_compute[187160]: 2025-12-05 12:47:13.516 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938833.515838, 4ace353f-ec30-46cb-9906-7b66b0f752a6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:47:13 np0005546954 nova_compute[187160]: 2025-12-05 12:47:13.517 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:47:13 np0005546954 nova_compute[187160]: 2025-12-05 12:47:13.672 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:47:13 np0005546954 nova_compute[187160]: 2025-12-05 12:47:13.677 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:47:13 np0005546954 nova_compute[187160]: 2025-12-05 12:47:13.759 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  5 07:47:14 np0005546954 nova_compute[187160]: 2025-12-05 12:47:14.232 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:14 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:14Z|00076|binding|INFO|Claiming lport 3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 for this chassis.
Dec  5 07:47:14 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:14Z|00077|binding|INFO|3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2: Claiming fa:16:3e:eb:b8:79 10.100.0.7
Dec  5 07:47:14 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:14Z|00078|binding|INFO|Setting lport 3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 up in Southbound
Dec  5 07:47:14 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:14.954 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:b8:79 10.100.0.7'], port_security=['fa:16:3e:eb:b8:79 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4ace353f-ec30-46cb-9906-7b66b0f752a6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2440340b-a243-4eda-a15d-71178a169a03', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3230f111af4a7b989f52dd95d9d57f', 'neutron:revision_number': '11', 'neutron:security_group_ids': '8f34e8a7-cd16-49df-a810-0667f963d3a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b311c115-047d-4871-8652-57d6506f66be, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:47:14 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:14.956 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 in datapath 2440340b-a243-4eda-a15d-71178a169a03 bound to our chassis#033[00m
Dec  5 07:47:14 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:14.957 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2440340b-a243-4eda-a15d-71178a169a03#033[00m
Dec  5 07:47:14 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:14.973 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[349f5496-003b-4d15-bb10-59d226adf999]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:15.011 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[e84287ff-d78b-4a2a-b0d4-43072d6001bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:15.016 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[ddf56984-ee48-4782-88c5-33bd7d83460d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:15.061 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[ca7c34b3-3e35-4074-ae41-20177bb15159]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:15.084 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d2609274-ba15-4892-8488-c731a90ecce5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2440340b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:f7:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 378366, 'reachable_time': 21487, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210579, 'error': None, 'target': 'ovnmeta-2440340b-a243-4eda-a15d-71178a169a03', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:15.111 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[a3b6c0b6-7841-4263-821f-3cdea12e94e3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2440340b-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378379, 'tstamp': 378379}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210580, 'error': None, 'target': 'ovnmeta-2440340b-a243-4eda-a15d-71178a169a03', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2440340b-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378381, 'tstamp': 378381}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210580, 'error': None, 'target': 'ovnmeta-2440340b-a243-4eda-a15d-71178a169a03', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:15.114 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2440340b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:15 np0005546954 nova_compute[187160]: 2025-12-05 12:47:15.116 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:15 np0005546954 nova_compute[187160]: 2025-12-05 12:47:15.118 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:15.118 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2440340b-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:15.119 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:47:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:15.119 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2440340b-a0, col_values=(('external_ids', {'iface-id': '6804ecdc-7520-4ec9-b54b-c1c436be63da'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:15.120 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:47:15 np0005546954 podman[210582]: 2025-12-05 12:47:15.589659905 +0000 UTC m=+0.080037762 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:47:15 np0005546954 podman[210581]: 2025-12-05 12:47:15.600277268 +0000 UTC m=+0.093961118 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:47:15 np0005546954 nova_compute[187160]: 2025-12-05 12:47:15.662 187164 INFO nova.compute.manager [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Post operation of migration started#033[00m
Dec  5 07:47:15 np0005546954 nova_compute[187160]: 2025-12-05 12:47:15.899 187164 DEBUG oslo_concurrency.lockutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-4ace353f-ec30-46cb-9906-7b66b0f752a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:47:15 np0005546954 nova_compute[187160]: 2025-12-05 12:47:15.900 187164 DEBUG oslo_concurrency.lockutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-4ace353f-ec30-46cb-9906-7b66b0f752a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:47:15 np0005546954 nova_compute[187160]: 2025-12-05 12:47:15.900 187164 DEBUG nova.network.neutron [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:47:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:16.943 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:16.943 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:16.944 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:16 np0005546954 nova_compute[187160]: 2025-12-05 12:47:16.970 187164 DEBUG nova.network.neutron [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Updating instance_info_cache with network_info: [{"id": "3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2", "address": "fa:16:3e:eb:b8:79", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cfce5e6-e2", "ovs_interfaceid": "3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:47:16 np0005546954 nova_compute[187160]: 2025-12-05 12:47:16.991 187164 DEBUG oslo_concurrency.lockutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-4ace353f-ec30-46cb-9906-7b66b0f752a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:47:17 np0005546954 nova_compute[187160]: 2025-12-05 12:47:17.006 187164 DEBUG oslo_concurrency.lockutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:17 np0005546954 nova_compute[187160]: 2025-12-05 12:47:17.007 187164 DEBUG oslo_concurrency.lockutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:17 np0005546954 nova_compute[187160]: 2025-12-05 12:47:17.007 187164 DEBUG oslo_concurrency.lockutils [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:17 np0005546954 nova_compute[187160]: 2025-12-05 12:47:17.014 187164 INFO nova.virt.libvirt.driver [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  5 07:47:17 np0005546954 virtqemud[186730]: Domain id=7 name='instance-00000007' uuid=4ace353f-ec30-46cb-9906-7b66b0f752a6 is tainted: custom-monitor
Dec  5 07:47:18 np0005546954 nova_compute[187160]: 2025-12-05 12:47:18.023 187164 INFO nova.virt.libvirt.driver [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  5 07:47:18 np0005546954 nova_compute[187160]: 2025-12-05 12:47:18.293 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:19 np0005546954 nova_compute[187160]: 2025-12-05 12:47:19.029 187164 INFO nova.virt.libvirt.driver [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  5 07:47:19 np0005546954 nova_compute[187160]: 2025-12-05 12:47:19.036 187164 DEBUG nova.compute.manager [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:47:19 np0005546954 nova_compute[187160]: 2025-12-05 12:47:19.059 187164 DEBUG nova.objects.instance [None req-6efa13b0-7cab-4a1b-b249-600235165d62 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:47:19 np0005546954 nova_compute[187160]: 2025-12-05 12:47:19.234 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:47:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:47:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:47:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:47:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:47:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:47:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:47:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:47:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:47:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:47:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:47:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.041 187164 DEBUG oslo_concurrency.lockutils [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.042 187164 DEBUG oslo_concurrency.lockutils [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.043 187164 DEBUG oslo_concurrency.lockutils [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.043 187164 DEBUG oslo_concurrency.lockutils [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.043 187164 DEBUG oslo_concurrency.lockutils [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.044 187164 INFO nova.compute.manager [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Terminating instance#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.045 187164 DEBUG nova.compute.manager [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:47:23 np0005546954 kernel: tapebe57519-50 (unregistering): left promiscuous mode
Dec  5 07:47:23 np0005546954 NetworkManager[55665]: <info>  [1764938843.0703] device (tapebe57519-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.080 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:23 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:23Z|00079|binding|INFO|Releasing lport ebe57519-501d-4a2e-be01-a8c24ea4af8f from this chassis (sb_readonly=0)
Dec  5 07:47:23 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:23Z|00080|binding|INFO|Setting lport ebe57519-501d-4a2e-be01-a8c24ea4af8f down in Southbound
Dec  5 07:47:23 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:23Z|00081|binding|INFO|Removing iface tapebe57519-50 ovn-installed in OVS
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.089 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:86:d3 10.100.0.10'], port_security=['fa:16:3e:4c:86:d3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c0f94a57-0be2-40c1-a4a8-5e04bbbb608d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2440340b-a243-4eda-a15d-71178a169a03', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3230f111af4a7b989f52dd95d9d57f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8f34e8a7-cd16-49df-a810-0667f963d3a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b311c115-047d-4871-8652-57d6506f66be, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=ebe57519-501d-4a2e-be01-a8c24ea4af8f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.091 104428 INFO neutron.agent.ovn.metadata.agent [-] Port ebe57519-501d-4a2e-be01-a8c24ea4af8f in datapath 2440340b-a243-4eda-a15d-71178a169a03 unbound from our chassis#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.093 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2440340b-a243-4eda-a15d-71178a169a03#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.106 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.112 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b0fd2c0b-2635-4427-b116-f1a4dc079eca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:23 np0005546954 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec  5 07:47:23 np0005546954 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 15.502s CPU time.
Dec  5 07:47:23 np0005546954 systemd-machined[153497]: Machine qemu-6-instance-00000008 terminated.
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.146 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab2b84a-a9f8-4eae-a4cd-b3090c23ddb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.150 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[9f052203-a1b1-49d4-bd19-a99a012e6253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.181 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[dc5577da-f676-49be-a425-0f6cf04be388]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.197 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f5823310-a9e8-4e5d-bca4-987df8037072]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2440340b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:f7:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 378366, 'reachable_time': 21487, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210645, 'error': None, 'target': 'ovnmeta-2440340b-a243-4eda-a15d-71178a169a03', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.212 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[0b5721b5-0ad0-4c07-8ec5-7e0b06f23eec]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2440340b-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378379, 'tstamp': 378379}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210646, 'error': None, 'target': 'ovnmeta-2440340b-a243-4eda-a15d-71178a169a03', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2440340b-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 378381, 'tstamp': 378381}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210646, 'error': None, 'target': 'ovnmeta-2440340b-a243-4eda-a15d-71178a169a03', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.214 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2440340b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.216 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.221 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.223 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2440340b-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.224 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.224 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2440340b-a0, col_values=(('external_ids', {'iface-id': '6804ecdc-7520-4ec9-b54b-c1c436be63da'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.224 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.295 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.306 187164 INFO nova.virt.libvirt.driver [-] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Instance destroyed successfully.#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.307 187164 DEBUG nova.objects.instance [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lazy-loading 'resources' on Instance uuid c0f94a57-0be2-40c1-a4a8-5e04bbbb608d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.324 187164 DEBUG nova.virt.libvirt.vif [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:46:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-92849756',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-92849756',id=8,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:46:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f3230f111af4a7b989f52dd95d9d57f',ramdisk_id='',reservation_id='r-495evnna',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project
_name='tempest-TestExecuteBasicStrategy-2049426111',owner_user_name='tempest-TestExecuteBasicStrategy-2049426111-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:46:18Z,user_data=None,user_id='a0f9474025d345019200ba286c9b5bf1',uuid=c0f94a57-0be2-40c1-a4a8-5e04bbbb608d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "address": "fa:16:3e:4c:86:d3", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe57519-50", "ovs_interfaceid": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.324 187164 DEBUG nova.network.os_vif_util [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Converting VIF {"id": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "address": "fa:16:3e:4c:86:d3", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebe57519-50", "ovs_interfaceid": "ebe57519-501d-4a2e-be01-a8c24ea4af8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.325 187164 DEBUG nova.network.os_vif_util [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4c:86:d3,bridge_name='br-int',has_traffic_filtering=True,id=ebe57519-501d-4a2e-be01-a8c24ea4af8f,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe57519-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.325 187164 DEBUG os_vif [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4c:86:d3,bridge_name='br-int',has_traffic_filtering=True,id=ebe57519-501d-4a2e-be01-a8c24ea4af8f,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe57519-50') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.327 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.327 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebe57519-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.329 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.332 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.337 187164 INFO os_vif [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4c:86:d3,bridge_name='br-int',has_traffic_filtering=True,id=ebe57519-501d-4a2e-be01-a8c24ea4af8f,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebe57519-50')#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.338 187164 INFO nova.virt.libvirt.driver [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Deleting instance files /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d_del#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.339 187164 INFO nova.virt.libvirt.driver [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Deletion of /var/lib/nova/instances/c0f94a57-0be2-40c1-a4a8-5e04bbbb608d_del complete#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.392 187164 INFO nova.compute.manager [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.393 187164 DEBUG oslo.service.loopingcall [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.393 187164 DEBUG nova.compute.manager [-] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.394 187164 DEBUG nova.network.neutron [-] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.552 187164 DEBUG nova.compute.manager [req-d5aef653-bcd3-45ba-9562-857a9f0bb459 req-36fb2e74-8d5c-462a-918e-e1d914eb9de6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Received event network-vif-unplugged-ebe57519-501d-4a2e-be01-a8c24ea4af8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.553 187164 DEBUG oslo_concurrency.lockutils [req-d5aef653-bcd3-45ba-9562-857a9f0bb459 req-36fb2e74-8d5c-462a-918e-e1d914eb9de6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.553 187164 DEBUG oslo_concurrency.lockutils [req-d5aef653-bcd3-45ba-9562-857a9f0bb459 req-36fb2e74-8d5c-462a-918e-e1d914eb9de6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.554 187164 DEBUG oslo_concurrency.lockutils [req-d5aef653-bcd3-45ba-9562-857a9f0bb459 req-36fb2e74-8d5c-462a-918e-e1d914eb9de6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.554 187164 DEBUG nova.compute.manager [req-d5aef653-bcd3-45ba-9562-857a9f0bb459 req-36fb2e74-8d5c-462a-918e-e1d914eb9de6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] No waiting events found dispatching network-vif-unplugged-ebe57519-501d-4a2e-be01-a8c24ea4af8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.555 187164 DEBUG nova.compute.manager [req-d5aef653-bcd3-45ba-9562-857a9f0bb459 req-36fb2e74-8d5c-462a-918e-e1d914eb9de6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Received event network-vif-unplugged-ebe57519-501d-4a2e-be01-a8c24ea4af8f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.787 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:47:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:23.788 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:47:23 np0005546954 nova_compute[187160]: 2025-12-05 12:47:23.789 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.017 187164 DEBUG nova.network.neutron [-] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.038 187164 INFO nova.compute.manager [-] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Took 0.64 seconds to deallocate network for instance.#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.088 187164 DEBUG oslo_concurrency.lockutils [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.088 187164 DEBUG oslo_concurrency.lockutils [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.195 187164 DEBUG nova.compute.provider_tree [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.227 187164 DEBUG nova.scheduler.client.report [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.237 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.271 187164 DEBUG oslo_concurrency.lockutils [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.302 187164 INFO nova.scheduler.client.report [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Deleted allocations for instance c0f94a57-0be2-40c1-a4a8-5e04bbbb608d#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.368 187164 DEBUG oslo_concurrency.lockutils [None req-86a7948f-46bd-4100-aebe-c94a9d37a097 a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.840 187164 DEBUG oslo_concurrency.lockutils [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "4ace353f-ec30-46cb-9906-7b66b0f752a6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.841 187164 DEBUG oslo_concurrency.lockutils [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "4ace353f-ec30-46cb-9906-7b66b0f752a6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.841 187164 DEBUG oslo_concurrency.lockutils [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "4ace353f-ec30-46cb-9906-7b66b0f752a6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.841 187164 DEBUG oslo_concurrency.lockutils [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "4ace353f-ec30-46cb-9906-7b66b0f752a6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.841 187164 DEBUG oslo_concurrency.lockutils [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "4ace353f-ec30-46cb-9906-7b66b0f752a6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.843 187164 INFO nova.compute.manager [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Terminating instance#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.844 187164 DEBUG nova.compute.manager [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:47:24 np0005546954 kernel: tap3cfce5e6-e2 (unregistering): left promiscuous mode
Dec  5 07:47:24 np0005546954 NetworkManager[55665]: <info>  [1764938844.8740] device (tap3cfce5e6-e2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.919 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:24 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:24Z|00082|binding|INFO|Releasing lport 3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 from this chassis (sb_readonly=0)
Dec  5 07:47:24 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:24Z|00083|binding|INFO|Setting lport 3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 down in Southbound
Dec  5 07:47:24 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:24Z|00084|binding|INFO|Removing iface tap3cfce5e6-e2 ovn-installed in OVS
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.921 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:24.927 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:b8:79 10.100.0.7'], port_security=['fa:16:3e:eb:b8:79 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4ace353f-ec30-46cb-9906-7b66b0f752a6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2440340b-a243-4eda-a15d-71178a169a03', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f3230f111af4a7b989f52dd95d9d57f', 'neutron:revision_number': '13', 'neutron:security_group_ids': '8f34e8a7-cd16-49df-a810-0667f963d3a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b311c115-047d-4871-8652-57d6506f66be, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:47:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:24.928 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 in datapath 2440340b-a243-4eda-a15d-71178a169a03 unbound from our chassis#033[00m
Dec  5 07:47:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:24.929 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2440340b-a243-4eda-a15d-71178a169a03, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:47:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:24.930 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[61d46899-b13a-4792-8859-ecd1dde9460a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:24.930 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2440340b-a243-4eda-a15d-71178a169a03 namespace which is not needed anymore#033[00m
Dec  5 07:47:24 np0005546954 nova_compute[187160]: 2025-12-05 12:47:24.934 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:24 np0005546954 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec  5 07:47:24 np0005546954 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 1.589s CPU time.
Dec  5 07:47:24 np0005546954 systemd-machined[153497]: Machine qemu-7-instance-00000007 terminated.
Dec  5 07:47:25 np0005546954 NetworkManager[55665]: <info>  [1764938845.0692] manager: (tap3cfce5e6-e2): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.124 187164 INFO nova.virt.libvirt.driver [-] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Instance destroyed successfully.#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.125 187164 DEBUG nova.objects.instance [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lazy-loading 'resources' on Instance uuid 4ace353f-ec30-46cb-9906-7b66b0f752a6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.144 187164 DEBUG nova.virt.libvirt.vif [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:45:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1530812500',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1530812500',id=7,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:46:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f3230f111af4a7b989f52dd95d9d57f',ramdisk_id='',reservation_id='r-7gjdsrre',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min
_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-2049426111',owner_user_name='tempest-TestExecuteBasicStrategy-2049426111-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:47:19Z,user_data=None,user_id='a0f9474025d345019200ba286c9b5bf1',uuid=4ace353f-ec30-46cb-9906-7b66b0f752a6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2", "address": "fa:16:3e:eb:b8:79", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cfce5e6-e2", "ovs_interfaceid": "3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.145 187164 DEBUG nova.network.os_vif_util [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Converting VIF {"id": "3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2", "address": "fa:16:3e:eb:b8:79", "network": {"id": "2440340b-a243-4eda-a15d-71178a169a03", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1967301454-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f3230f111af4a7b989f52dd95d9d57f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cfce5e6-e2", "ovs_interfaceid": "3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.145 187164 DEBUG nova.network.os_vif_util [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:eb:b8:79,bridge_name='br-int',has_traffic_filtering=True,id=3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cfce5e6-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.146 187164 DEBUG os_vif [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:b8:79,bridge_name='br-int',has_traffic_filtering=True,id=3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cfce5e6-e2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.147 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.147 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3cfce5e6-e2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.149 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.151 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.155 187164 INFO os_vif [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:b8:79,bridge_name='br-int',has_traffic_filtering=True,id=3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2,network=Network(2440340b-a243-4eda-a15d-71178a169a03),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cfce5e6-e2')#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.156 187164 INFO nova.virt.libvirt.driver [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Deleting instance files /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6_del#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.156 187164 INFO nova.virt.libvirt.driver [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Deletion of /var/lib/nova/instances/4ace353f-ec30-46cb-9906-7b66b0f752a6_del complete#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.214 187164 INFO nova.compute.manager [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.215 187164 DEBUG oslo.service.loopingcall [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.216 187164 DEBUG nova.compute.manager [-] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.216 187164 DEBUG nova.network.neutron [-] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:47:25 np0005546954 neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03[210285]: [NOTICE]   (210289) : haproxy version is 2.8.14-c23fe91
Dec  5 07:47:25 np0005546954 neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03[210285]: [NOTICE]   (210289) : path to executable is /usr/sbin/haproxy
Dec  5 07:47:25 np0005546954 neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03[210285]: [WARNING]  (210289) : Exiting Master process...
Dec  5 07:47:25 np0005546954 neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03[210285]: [WARNING]  (210289) : Exiting Master process...
Dec  5 07:47:25 np0005546954 neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03[210285]: [ALERT]    (210289) : Current worker (210291) exited with code 143 (Terminated)
Dec  5 07:47:25 np0005546954 neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03[210285]: [WARNING]  (210289) : All workers exited. Exiting... (0)
Dec  5 07:47:25 np0005546954 systemd[1]: libpod-c3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff.scope: Deactivated successfully.
Dec  5 07:47:25 np0005546954 podman[210687]: 2025-12-05 12:47:25.261522828 +0000 UTC m=+0.238287686 container died c3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec  5 07:47:25 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff-userdata-shm.mount: Deactivated successfully.
Dec  5 07:47:25 np0005546954 systemd[1]: var-lib-containers-storage-overlay-5bb38938fb4079d52dc35ec20d1a5122f5d6a17b332a4128ca4a0179d79a4200-merged.mount: Deactivated successfully.
Dec  5 07:47:25 np0005546954 podman[210687]: 2025-12-05 12:47:25.380617645 +0000 UTC m=+0.357382463 container cleanup c3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:47:25 np0005546954 systemd[1]: libpod-conmon-c3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff.scope: Deactivated successfully.
Dec  5 07:47:25 np0005546954 podman[210735]: 2025-12-05 12:47:25.461370919 +0000 UTC m=+0.047373688 container remove c3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:47:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:25.467 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[1387f602-b252-42af-a74f-58f554845c3e]: (4, ('Fri Dec  5 12:47:25 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03 (c3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff)\nc3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff\nFri Dec  5 12:47:25 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2440340b-a243-4eda-a15d-71178a169a03 (c3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff)\nc3e95528b64d59db131bfe7c2defbc0d1465fd9498653ea786ffcafe4cddbbff\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:25.469 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a979a2-d4d1-4a66-b8eb-049309060c3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:25.469 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2440340b-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.471 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:25 np0005546954 kernel: tap2440340b-a0: left promiscuous mode
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.496 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:25.499 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[75285a21-1cc4-4737-94a8-24d6f0516076]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:25.521 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1397d7-bbec-45c9-9c90-488d375e5b25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:25.523 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae2dd38-4f51-4cf9-9118-0e60b644d914]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:25.543 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[131ba5b3-37d9-4dd1-b367-691e742849fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 378357, 'reachable_time': 25843, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210750, 'error': None, 'target': 'ovnmeta-2440340b-a243-4eda-a15d-71178a169a03', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:25.547 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2440340b-a243-4eda-a15d-71178a169a03 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:47:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:25.548 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a88e6b-5bc3-4fed-9fd2-0cf310e0aa57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:47:25 np0005546954 systemd[1]: run-netns-ovnmeta\x2d2440340b\x2da243\x2d4eda\x2da15d\x2d71178a169a03.mount: Deactivated successfully.
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.645 187164 DEBUG nova.compute.manager [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Received event network-vif-plugged-ebe57519-501d-4a2e-be01-a8c24ea4af8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.645 187164 DEBUG oslo_concurrency.lockutils [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.645 187164 DEBUG oslo_concurrency.lockutils [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.646 187164 DEBUG oslo_concurrency.lockutils [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c0f94a57-0be2-40c1-a4a8-5e04bbbb608d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.646 187164 DEBUG nova.compute.manager [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] No waiting events found dispatching network-vif-plugged-ebe57519-501d-4a2e-be01-a8c24ea4af8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.646 187164 WARNING nova.compute.manager [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Received unexpected event network-vif-plugged-ebe57519-501d-4a2e-be01-a8c24ea4af8f for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.646 187164 DEBUG nova.compute.manager [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Received event network-vif-deleted-ebe57519-501d-4a2e-be01-a8c24ea4af8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.647 187164 DEBUG nova.compute.manager [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Received event network-vif-unplugged-3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.647 187164 DEBUG oslo_concurrency.lockutils [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4ace353f-ec30-46cb-9906-7b66b0f752a6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.647 187164 DEBUG oslo_concurrency.lockutils [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4ace353f-ec30-46cb-9906-7b66b0f752a6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.647 187164 DEBUG oslo_concurrency.lockutils [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4ace353f-ec30-46cb-9906-7b66b0f752a6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.647 187164 DEBUG nova.compute.manager [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] No waiting events found dispatching network-vif-unplugged-3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.647 187164 DEBUG nova.compute.manager [req-5db468e7-bfc7-4a9d-8f53-29ca8dfea776 req-0e37efba-7f88-4470-86e0-9b738436e87c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Received event network-vif-unplugged-3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:47:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:47:25.790 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.891 187164 DEBUG nova.network.neutron [-] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.906 187164 INFO nova.compute.manager [-] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Took 0.69 seconds to deallocate network for instance.#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.954 187164 DEBUG oslo_concurrency.lockutils [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.954 187164 DEBUG oslo_concurrency.lockutils [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:25 np0005546954 nova_compute[187160]: 2025-12-05 12:47:25.966 187164 DEBUG oslo_concurrency.lockutils [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:26 np0005546954 nova_compute[187160]: 2025-12-05 12:47:26.189 187164 INFO nova.scheduler.client.report [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Deleted allocations for instance 4ace353f-ec30-46cb-9906-7b66b0f752a6#033[00m
Dec  5 07:47:27 np0005546954 nova_compute[187160]: 2025-12-05 12:47:27.135 187164 DEBUG oslo_concurrency.lockutils [None req-63edfca0-a6ea-46e0-bfb6-b55f5c25064a a0f9474025d345019200ba286c9b5bf1 8f3230f111af4a7b989f52dd95d9d57f - - default default] Lock "4ace353f-ec30-46cb-9906-7b66b0f752a6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:27 np0005546954 nova_compute[187160]: 2025-12-05 12:47:27.725 187164 DEBUG nova.compute.manager [req-53638484-0787-4af0-bd66-1c3b29d7424a req-b6f1e4c2-50cc-42f9-a4cb-2e4c76dbb35f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Received event network-vif-plugged-3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:47:27 np0005546954 nova_compute[187160]: 2025-12-05 12:47:27.726 187164 DEBUG oslo_concurrency.lockutils [req-53638484-0787-4af0-bd66-1c3b29d7424a req-b6f1e4c2-50cc-42f9-a4cb-2e4c76dbb35f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4ace353f-ec30-46cb-9906-7b66b0f752a6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:27 np0005546954 nova_compute[187160]: 2025-12-05 12:47:27.726 187164 DEBUG oslo_concurrency.lockutils [req-53638484-0787-4af0-bd66-1c3b29d7424a req-b6f1e4c2-50cc-42f9-a4cb-2e4c76dbb35f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4ace353f-ec30-46cb-9906-7b66b0f752a6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:27 np0005546954 nova_compute[187160]: 2025-12-05 12:47:27.727 187164 DEBUG oslo_concurrency.lockutils [req-53638484-0787-4af0-bd66-1c3b29d7424a req-b6f1e4c2-50cc-42f9-a4cb-2e4c76dbb35f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4ace353f-ec30-46cb-9906-7b66b0f752a6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:27 np0005546954 nova_compute[187160]: 2025-12-05 12:47:27.727 187164 DEBUG nova.compute.manager [req-53638484-0787-4af0-bd66-1c3b29d7424a req-b6f1e4c2-50cc-42f9-a4cb-2e4c76dbb35f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] No waiting events found dispatching network-vif-plugged-3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:47:27 np0005546954 nova_compute[187160]: 2025-12-05 12:47:27.728 187164 WARNING nova.compute.manager [req-53638484-0787-4af0-bd66-1c3b29d7424a req-b6f1e4c2-50cc-42f9-a4cb-2e4c76dbb35f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Received unexpected event network-vif-plugged-3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:47:27 np0005546954 nova_compute[187160]: 2025-12-05 12:47:27.728 187164 DEBUG nova.compute.manager [req-53638484-0787-4af0-bd66-1c3b29d7424a req-b6f1e4c2-50cc-42f9-a4cb-2e4c76dbb35f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Received event network-vif-deleted-3cfce5e6-e2b3-48b1-8c76-a31b5e3fc2b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:47:28 np0005546954 podman[210751]: 2025-12-05 12:47:28.56398994 +0000 UTC m=+0.076184211 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350)
Dec  5 07:47:28 np0005546954 podman[210752]: 2025-12-05 12:47:28.571301649 +0000 UTC m=+0.076379356 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 07:47:29 np0005546954 nova_compute[187160]: 2025-12-05 12:47:29.239 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:30 np0005546954 nova_compute[187160]: 2025-12-05 12:47:30.153 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:34 np0005546954 nova_compute[187160]: 2025-12-05 12:47:34.242 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:35 np0005546954 nova_compute[187160]: 2025-12-05 12:47:35.157 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:35 np0005546954 podman[197513]: time="2025-12-05T12:47:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:47:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:47:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:47:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:47:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Dec  5 07:47:38 np0005546954 nova_compute[187160]: 2025-12-05 12:47:38.305 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764938843.304299, c0f94a57-0be2-40c1-a4a8-5e04bbbb608d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:47:38 np0005546954 nova_compute[187160]: 2025-12-05 12:47:38.306 187164 INFO nova.compute.manager [-] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:47:38 np0005546954 nova_compute[187160]: 2025-12-05 12:47:38.333 187164 DEBUG nova.compute.manager [None req-0a071810-7d1c-4844-9dd0-54cda6f6df84 - - - - - -] [instance: c0f94a57-0be2-40c1-a4a8-5e04bbbb608d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:47:39 np0005546954 nova_compute[187160]: 2025-12-05 12:47:39.244 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:40 np0005546954 nova_compute[187160]: 2025-12-05 12:47:40.123 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764938845.1217985, 4ace353f-ec30-46cb-9906-7b66b0f752a6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:47:40 np0005546954 nova_compute[187160]: 2025-12-05 12:47:40.124 187164 INFO nova.compute.manager [-] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:47:40 np0005546954 nova_compute[187160]: 2025-12-05 12:47:40.160 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:40 np0005546954 nova_compute[187160]: 2025-12-05 12:47:40.288 187164 DEBUG nova.compute.manager [None req-8b529f0b-fa4f-4905-be7d-b838bf881ae4 - - - - - -] [instance: 4ace353f-ec30-46cb-9906-7b66b0f752a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:47:42 np0005546954 podman[210792]: 2025-12-05 12:47:42.549463138 +0000 UTC m=+0.057785253 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Dec  5 07:47:44 np0005546954 nova_compute[187160]: 2025-12-05 12:47:44.247 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:45 np0005546954 nova_compute[187160]: 2025-12-05 12:47:45.164 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:46 np0005546954 podman[210813]: 2025-12-05 12:47:46.552368655 +0000 UTC m=+0.059167137 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:47:46 np0005546954 podman[210812]: 2025-12-05 12:47:46.590294015 +0000 UTC m=+0.100814144 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:47:49 np0005546954 nova_compute[187160]: 2025-12-05 12:47:49.247 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:47:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:47:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:47:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:47:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:47:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:47:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:47:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:47:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:47:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:47:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:47:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:47:50 np0005546954 nova_compute[187160]: 2025-12-05 12:47:50.168 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:53 np0005546954 nova_compute[187160]: 2025-12-05 12:47:53.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:47:53 np0005546954 nova_compute[187160]: 2025-12-05 12:47:53.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:47:53 np0005546954 nova_compute[187160]: 2025-12-05 12:47:53.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:47:53 np0005546954 nova_compute[187160]: 2025-12-05 12:47:53.074 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:47:54 np0005546954 nova_compute[187160]: 2025-12-05 12:47:54.251 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:55 np0005546954 nova_compute[187160]: 2025-12-05 12:47:55.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:47:55 np0005546954 nova_compute[187160]: 2025-12-05 12:47:55.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:47:55 np0005546954 nova_compute[187160]: 2025-12-05 12:47:55.170 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:55 np0005546954 ovn_controller[95566]: 2025-12-05T12:47:55Z|00085|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Dec  5 07:47:56 np0005546954 nova_compute[187160]: 2025-12-05 12:47:56.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:47:57 np0005546954 nova_compute[187160]: 2025-12-05 12:47:57.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:47:57 np0005546954 nova_compute[187160]: 2025-12-05 12:47:57.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.164 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.165 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.165 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.166 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.372 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.373 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5878MB free_disk=73.34021759033203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.373 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.374 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.447 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.447 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.470 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.486 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.505 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:47:58 np0005546954 nova_compute[187160]: 2025-12-05 12:47:58.505 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:47:59 np0005546954 nova_compute[187160]: 2025-12-05 12:47:59.251 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:47:59 np0005546954 nova_compute[187160]: 2025-12-05 12:47:59.499 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:47:59 np0005546954 podman[210863]: 2025-12-05 12:47:59.569657618 +0000 UTC m=+0.076536133 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container)
Dec  5 07:47:59 np0005546954 podman[210864]: 2025-12-05 12:47:59.591453051 +0000 UTC m=+0.087404593 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:48:00 np0005546954 nova_compute[187160]: 2025-12-05 12:48:00.172 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:48:01.720 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:48:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:48:01.722 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:48:01 np0005546954 nova_compute[187160]: 2025-12-05 12:48:01.723 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:48:01.724 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:48:02 np0005546954 nova_compute[187160]: 2025-12-05 12:48:02.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:48:03 np0005546954 nova_compute[187160]: 2025-12-05 12:48:03.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:48:03 np0005546954 nova_compute[187160]: 2025-12-05 12:48:03.657 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:04 np0005546954 nova_compute[187160]: 2025-12-05 12:48:04.284 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:05 np0005546954 nova_compute[187160]: 2025-12-05 12:48:05.175 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:05 np0005546954 podman[197513]: time="2025-12-05T12:48:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:48:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:48:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:48:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:48:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec  5 07:48:09 np0005546954 nova_compute[187160]: 2025-12-05 12:48:09.287 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:10 np0005546954 nova_compute[187160]: 2025-12-05 12:48:10.177 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:13 np0005546954 podman[210907]: 2025-12-05 12:48:13.547278531 +0000 UTC m=+0.055636356 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes 
Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec  5 07:48:14 np0005546954 nova_compute[187160]: 2025-12-05 12:48:14.289 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:15 np0005546954 nova_compute[187160]: 2025-12-05 12:48:15.179 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:48:16.944 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:48:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:48:16.945 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:48:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:48:16.945 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:48:17 np0005546954 podman[210929]: 2025-12-05 12:48:17.587484147 +0000 UTC m=+0.071030900 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:48:17 np0005546954 podman[210928]: 2025-12-05 12:48:17.612948696 +0000 UTC m=+0.110659373 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:48:19 np0005546954 nova_compute[187160]: 2025-12-05 12:48:19.290 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:48:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:48:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:48:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:48:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:48:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:48:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:48:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:48:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:48:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:48:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:48:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:48:20 np0005546954 nova_compute[187160]: 2025-12-05 12:48:20.181 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:24 np0005546954 nova_compute[187160]: 2025-12-05 12:48:24.293 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:25 np0005546954 nova_compute[187160]: 2025-12-05 12:48:25.185 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:29 np0005546954 nova_compute[187160]: 2025-12-05 12:48:29.295 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:30 np0005546954 nova_compute[187160]: 2025-12-05 12:48:30.187 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:30 np0005546954 podman[210977]: 2025-12-05 12:48:30.554034158 +0000 UTC m=+0.064211006 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec  5 07:48:30 np0005546954 podman[210978]: 2025-12-05 12:48:30.591411621 +0000 UTC m=+0.090059247 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Dec  5 07:48:34 np0005546954 nova_compute[187160]: 2025-12-05 12:48:34.300 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:35 np0005546954 nova_compute[187160]: 2025-12-05 12:48:35.192 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:35 np0005546954 podman[197513]: time="2025-12-05T12:48:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:48:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:48:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:48:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:48:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec  5 07:48:37 np0005546954 ovn_controller[95566]: 2025-12-05T12:48:37Z|00086|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Dec  5 07:48:39 np0005546954 nova_compute[187160]: 2025-12-05 12:48:39.303 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:40 np0005546954 nova_compute[187160]: 2025-12-05 12:48:40.195 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:44 np0005546954 nova_compute[187160]: 2025-12-05 12:48:44.307 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:44 np0005546954 podman[211020]: 2025-12-05 12:48:44.591844948 +0000 UTC m=+0.087418204 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:48:45 np0005546954 nova_compute[187160]: 2025-12-05 12:48:45.198 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:48 np0005546954 podman[211040]: 2025-12-05 12:48:48.544785048 +0000 UTC m=+0.055886004 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:48:48 np0005546954 podman[211039]: 2025-12-05 12:48:48.622235308 +0000 UTC m=+0.138070382 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:48:49 np0005546954 nova_compute[187160]: 2025-12-05 12:48:49.309 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:48:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:48:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:48:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:48:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:48:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:48:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:48:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:48:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:48:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:48:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:48:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:48:50 np0005546954 nova_compute[187160]: 2025-12-05 12:48:50.201 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:53 np0005546954 nova_compute[187160]: 2025-12-05 12:48:53.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:48:53 np0005546954 nova_compute[187160]: 2025-12-05 12:48:53.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:48:53 np0005546954 nova_compute[187160]: 2025-12-05 12:48:53.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:48:53 np0005546954 nova_compute[187160]: 2025-12-05 12:48:53.057 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:48:54 np0005546954 nova_compute[187160]: 2025-12-05 12:48:54.313 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:55 np0005546954 nova_compute[187160]: 2025-12-05 12:48:55.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:48:55 np0005546954 nova_compute[187160]: 2025-12-05 12:48:55.203 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:56 np0005546954 nova_compute[187160]: 2025-12-05 12:48:56.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:48:56 np0005546954 nova_compute[187160]: 2025-12-05 12:48:56.051 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:48:58 np0005546954 nova_compute[187160]: 2025-12-05 12:48:58.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.294 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.295 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.295 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.295 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.315 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.475 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.477 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5877MB free_disk=73.33633804321289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.477 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.477 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.555 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.555 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.593 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing inventories for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.618 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating ProviderTree inventory for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.619 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.632 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing aggregate associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.651 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing trait associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.676 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.702 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.704 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:48:59 np0005546954 nova_compute[187160]: 2025-12-05 12:48:59.705 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:49:00 np0005546954 nova_compute[187160]: 2025-12-05 12:49:00.206 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:01 np0005546954 podman[211093]: 2025-12-05 12:49:01.589548128 +0000 UTC m=+0.080196411 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec  5 07:49:01 np0005546954 podman[211092]: 2025-12-05 12:49:01.58962832 +0000 UTC m=+0.081097680 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  5 07:49:01 np0005546954 nova_compute[187160]: 2025-12-05 12:49:01.701 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:49:02 np0005546954 nova_compute[187160]: 2025-12-05 12:49:02.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:49:04 np0005546954 nova_compute[187160]: 2025-12-05 12:49:04.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:49:04 np0005546954 nova_compute[187160]: 2025-12-05 12:49:04.318 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:04.505 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:49:04 np0005546954 nova_compute[187160]: 2025-12-05 12:49:04.507 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:04 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:04.507 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:49:05 np0005546954 nova_compute[187160]: 2025-12-05 12:49:05.209 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:05 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:05.510 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:49:05 np0005546954 podman[197513]: time="2025-12-05T12:49:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:49:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:49:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:49:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:49:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2583 "" "Go-http-client/1.1"
Dec  5 07:49:09 np0005546954 nova_compute[187160]: 2025-12-05 12:49:09.328 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:10 np0005546954 nova_compute[187160]: 2025-12-05 12:49:10.211 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:13 np0005546954 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  5 07:49:14 np0005546954 nova_compute[187160]: 2025-12-05 12:49:14.652 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:15 np0005546954 nova_compute[187160]: 2025-12-05 12:49:15.213 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:15 np0005546954 podman[211135]: 2025-12-05 12:49:15.572223432 +0000 UTC m=+0.078081335 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec  5 07:49:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:16.945 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:49:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:16.946 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:49:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:16.946 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:49:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:49:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:49:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:49:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:49:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:49:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:49:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:49:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:49:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:49:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:49:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:49:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:49:19 np0005546954 podman[211156]: 2025-12-05 12:49:19.522135384 +0000 UTC m=+0.061963356 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:49:19 np0005546954 podman[211155]: 2025-12-05 12:49:19.554057431 +0000 UTC m=+0.101865845 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec  5 07:49:19 np0005546954 nova_compute[187160]: 2025-12-05 12:49:19.655 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:20 np0005546954 nova_compute[187160]: 2025-12-05 12:49:20.215 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:24 np0005546954 nova_compute[187160]: 2025-12-05 12:49:24.656 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:25 np0005546954 nova_compute[187160]: 2025-12-05 12:49:25.217 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:29 np0005546954 nova_compute[187160]: 2025-12-05 12:49:29.659 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:30 np0005546954 nova_compute[187160]: 2025-12-05 12:49:30.221 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:32 np0005546954 podman[211204]: 2025-12-05 12:49:32.60230339 +0000 UTC m=+0.101913017 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec  5 07:49:32 np0005546954 podman[211203]: 2025-12-05 12:49:32.613141102 +0000 UTC m=+0.120220324 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Dec  5 07:49:32 np0005546954 nova_compute[187160]: 2025-12-05 12:49:32.747 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:49:32 np0005546954 nova_compute[187160]: 2025-12-05 12:49:32.748 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:49:32 np0005546954 nova_compute[187160]: 2025-12-05 12:49:32.764 187164 DEBUG nova.compute.manager [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:49:32 np0005546954 nova_compute[187160]: 2025-12-05 12:49:32.832 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:49:32 np0005546954 nova_compute[187160]: 2025-12-05 12:49:32.833 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:49:32 np0005546954 nova_compute[187160]: 2025-12-05 12:49:32.840 187164 DEBUG nova.virt.hardware [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:49:32 np0005546954 nova_compute[187160]: 2025-12-05 12:49:32.840 187164 INFO nova.compute.claims [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  5 07:49:32 np0005546954 nova_compute[187160]: 2025-12-05 12:49:32.943 187164 DEBUG nova.compute.provider_tree [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:49:32 np0005546954 nova_compute[187160]: 2025-12-05 12:49:32.956 187164 DEBUG nova.scheduler.client.report [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:49:32 np0005546954 nova_compute[187160]: 2025-12-05 12:49:32.983 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:49:32 np0005546954 nova_compute[187160]: 2025-12-05 12:49:32.984 187164 DEBUG nova.compute.manager [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.032 187164 DEBUG nova.compute.manager [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.032 187164 DEBUG nova.network.neutron [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.052 187164 INFO nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.067 187164 DEBUG nova.compute.manager [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.146 187164 DEBUG nova.compute.manager [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.147 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.148 187164 INFO nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Creating image(s)#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.148 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "/var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.149 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "/var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.149 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "/var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.162 187164 DEBUG oslo_concurrency.processutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.226 187164 DEBUG oslo_concurrency.processutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.228 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.228 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.239 187164 DEBUG oslo_concurrency.processutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.298 187164 DEBUG oslo_concurrency.processutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.299 187164 DEBUG oslo_concurrency.processutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.452 187164 DEBUG nova.policy [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40d645c136824137bea297268f8a9cee', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7e0665b21e8d4fc092797e18e0320f99', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.505 187164 DEBUG oslo_concurrency.processutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk 1073741824" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.506 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.507 187164 DEBUG oslo_concurrency.processutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.571 187164 DEBUG oslo_concurrency.processutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.572 187164 DEBUG nova.virt.disk.api [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Checking if we can resize image /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.572 187164 DEBUG oslo_concurrency.processutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.651 187164 DEBUG oslo_concurrency.processutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.653 187164 DEBUG nova.virt.disk.api [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Cannot resize image /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.653 187164 DEBUG nova.objects.instance [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lazy-loading 'migration_context' on Instance uuid 47ffc1ff-837b-495c-a5ec-6a8b95d36137 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.667 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.668 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Ensure instance console log exists: /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.669 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.669 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.669 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:49:33 np0005546954 nova_compute[187160]: 2025-12-05 12:49:33.937 187164 DEBUG nova.network.neutron [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Successfully created port: 5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:49:34 np0005546954 nova_compute[187160]: 2025-12-05 12:49:34.660 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:34 np0005546954 nova_compute[187160]: 2025-12-05 12:49:34.682 187164 DEBUG nova.network.neutron [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Successfully updated port: 5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:49:34 np0005546954 nova_compute[187160]: 2025-12-05 12:49:34.700 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "refresh_cache-47ffc1ff-837b-495c-a5ec-6a8b95d36137" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:49:34 np0005546954 nova_compute[187160]: 2025-12-05 12:49:34.701 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquired lock "refresh_cache-47ffc1ff-837b-495c-a5ec-6a8b95d36137" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:49:34 np0005546954 nova_compute[187160]: 2025-12-05 12:49:34.701 187164 DEBUG nova.network.neutron [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:49:34 np0005546954 nova_compute[187160]: 2025-12-05 12:49:34.800 187164 DEBUG nova.compute.manager [req-ea66f3a4-3e71-486d-9503-251d652153f5 req-2207ba35-920e-49bf-8a89-952409c9ae5c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Received event network-changed-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:49:34 np0005546954 nova_compute[187160]: 2025-12-05 12:49:34.801 187164 DEBUG nova.compute.manager [req-ea66f3a4-3e71-486d-9503-251d652153f5 req-2207ba35-920e-49bf-8a89-952409c9ae5c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Refreshing instance network info cache due to event network-changed-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:49:34 np0005546954 nova_compute[187160]: 2025-12-05 12:49:34.801 187164 DEBUG oslo_concurrency.lockutils [req-ea66f3a4-3e71-486d-9503-251d652153f5 req-2207ba35-920e-49bf-8a89-952409c9ae5c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-47ffc1ff-837b-495c-a5ec-6a8b95d36137" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:49:34 np0005546954 nova_compute[187160]: 2025-12-05 12:49:34.875 187164 DEBUG nova.network.neutron [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.224 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:35 np0005546954 podman[197513]: time="2025-12-05T12:49:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:49:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:49:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:49:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:49:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2581 "" "Go-http-client/1.1"
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.698 187164 DEBUG nova.network.neutron [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Updating instance_info_cache with network_info: [{"id": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "address": "fa:16:3e:28:d9:1b", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3cdfcf-aa", "ovs_interfaceid": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.721 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Releasing lock "refresh_cache-47ffc1ff-837b-495c-a5ec-6a8b95d36137" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.721 187164 DEBUG nova.compute.manager [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Instance network_info: |[{"id": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "address": "fa:16:3e:28:d9:1b", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3cdfcf-aa", "ovs_interfaceid": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.722 187164 DEBUG oslo_concurrency.lockutils [req-ea66f3a4-3e71-486d-9503-251d652153f5 req-2207ba35-920e-49bf-8a89-952409c9ae5c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-47ffc1ff-837b-495c-a5ec-6a8b95d36137" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.722 187164 DEBUG nova.network.neutron [req-ea66f3a4-3e71-486d-9503-251d652153f5 req-2207ba35-920e-49bf-8a89-952409c9ae5c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Refreshing network info cache for port 5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.724 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Start _get_guest_xml network_info=[{"id": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "address": "fa:16:3e:28:d9:1b", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3cdfcf-aa", "ovs_interfaceid": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.729 187164 WARNING nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.736 187164 DEBUG nova.virt.libvirt.host [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.738 187164 DEBUG nova.virt.libvirt.host [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.747 187164 DEBUG nova.virt.libvirt.host [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.748 187164 DEBUG nova.virt.libvirt.host [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.749 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.749 187164 DEBUG nova.virt.hardware [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.750 187164 DEBUG nova.virt.hardware [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.750 187164 DEBUG nova.virt.hardware [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.750 187164 DEBUG nova.virt.hardware [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.750 187164 DEBUG nova.virt.hardware [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.750 187164 DEBUG nova.virt.hardware [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.751 187164 DEBUG nova.virt.hardware [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.751 187164 DEBUG nova.virt.hardware [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.751 187164 DEBUG nova.virt.hardware [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.751 187164 DEBUG nova.virt.hardware [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.752 187164 DEBUG nova.virt.hardware [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.755 187164 DEBUG nova.virt.libvirt.vif [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:49:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-799514834',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-799514834',id=10,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e0665b21e8d4fc092797e18e0320f99',ramdisk_id='',reservation_id='r-731kb9gv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-680209779',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-680209779-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:49:33Z,user_data=None,user_id='40d645c136824137bea297268f8a9cee',uuid=47ffc1ff-837b-495c-a5ec-6a8b95d36137,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "address": "fa:16:3e:28:d9:1b", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3cdfcf-aa", "ovs_interfaceid": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.755 187164 DEBUG nova.network.os_vif_util [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Converting VIF {"id": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "address": "fa:16:3e:28:d9:1b", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3cdfcf-aa", "ovs_interfaceid": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.756 187164 DEBUG nova.network.os_vif_util [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:d9:1b,bridge_name='br-int',has_traffic_filtering=True,id=5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3cdfcf-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.757 187164 DEBUG nova.objects.instance [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lazy-loading 'pci_devices' on Instance uuid 47ffc1ff-837b-495c-a5ec-6a8b95d36137 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.777 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  <uuid>47ffc1ff-837b-495c-a5ec-6a8b95d36137</uuid>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  <name>instance-0000000a</name>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-799514834</nova:name>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 12:49:35</nova:creationTime>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:        <nova:user uuid="40d645c136824137bea297268f8a9cee">tempest-TestExecuteHostMaintenanceStrategy-680209779-project-member</nova:user>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:        <nova:project uuid="7e0665b21e8d4fc092797e18e0320f99">tempest-TestExecuteHostMaintenanceStrategy-680209779</nova:project>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:        <nova:port uuid="5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <system>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <entry name="serial">47ffc1ff-837b-495c-a5ec-6a8b95d36137</entry>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <entry name="uuid">47ffc1ff-837b-495c-a5ec-6a8b95d36137</entry>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    </system>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  <os>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  </clock>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk.config"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:28:d9:1b"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <target dev="tap5a3cdfcf-aa"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/console.log" append="off"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    </serial>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <video>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 07:49:35 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 07:49:35 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:49:35 np0005546954 nova_compute[187160]: </domain>
Dec  5 07:49:35 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.778 187164 DEBUG nova.compute.manager [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Preparing to wait for external event network-vif-plugged-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.778 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.779 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.779 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.780 187164 DEBUG nova.virt.libvirt.vif [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:49:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-799514834',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-799514834',id=10,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e0665b21e8d4fc092797e18e0320f99',ramdisk_id='',reservation_id='r-731kb9gv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-680209779',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-680209779-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:49:33Z,user_data=None,user_id='40d645c136824137bea297268f8a9cee',uuid=47ffc1ff-837b-495c-a5ec-6a8b95d36137,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "address": "fa:16:3e:28:d9:1b", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3cdfcf-aa", "ovs_interfaceid": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.780 187164 DEBUG nova.network.os_vif_util [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Converting VIF {"id": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "address": "fa:16:3e:28:d9:1b", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3cdfcf-aa", "ovs_interfaceid": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.781 187164 DEBUG nova.network.os_vif_util [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:d9:1b,bridge_name='br-int',has_traffic_filtering=True,id=5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3cdfcf-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.781 187164 DEBUG os_vif [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:d9:1b,bridge_name='br-int',has_traffic_filtering=True,id=5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3cdfcf-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.782 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.782 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.782 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.786 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.786 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5a3cdfcf-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.787 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5a3cdfcf-aa, col_values=(('external_ids', {'iface-id': '5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:28:d9:1b', 'vm-uuid': '47ffc1ff-837b-495c-a5ec-6a8b95d36137'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.788 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:35 np0005546954 NetworkManager[55665]: <info>  [1764938975.7896] manager: (tap5a3cdfcf-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.791 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.796 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.798 187164 INFO os_vif [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:d9:1b,bridge_name='br-int',has_traffic_filtering=True,id=5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3cdfcf-aa')#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.843 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.843 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.844 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] No VIF found with MAC fa:16:3e:28:d9:1b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:49:35 np0005546954 nova_compute[187160]: 2025-12-05 12:49:35.844 187164 INFO nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Using config drive#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.101 187164 INFO nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Creating config drive at /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk.config#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.106 187164 DEBUG oslo_concurrency.processutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfm5i55ev execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.235 187164 DEBUG oslo_concurrency.processutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfm5i55ev" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:49:36 np0005546954 kernel: tap5a3cdfcf-aa: entered promiscuous mode
Dec  5 07:49:36 np0005546954 NetworkManager[55665]: <info>  [1764938976.3153] manager: (tap5a3cdfcf-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.314 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:36 np0005546954 ovn_controller[95566]: 2025-12-05T12:49:36Z|00087|binding|INFO|Claiming lport 5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 for this chassis.
Dec  5 07:49:36 np0005546954 ovn_controller[95566]: 2025-12-05T12:49:36Z|00088|binding|INFO|5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676: Claiming fa:16:3e:28:d9:1b 10.100.0.5
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.317 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.325 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.334 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:d9:1b 10.100.0.5'], port_security=['fa:16:3e:28:d9:1b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '47ffc1ff-837b-495c-a5ec-6a8b95d36137', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e0665b21e8d4fc092797e18e0320f99', 'neutron:revision_number': '2', 'neutron:security_group_ids': '66fe94db-41ba-4d2d-b949-4aa77c13cd33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d4f604e-5305-48b4-8813-83692bab39ba, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.337 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 in datapath 6e67014a-5792-45fa-ac5c-49089f8dc0ef bound to our chassis#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.340 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6e67014a-5792-45fa-ac5c-49089f8dc0ef#033[00m
Dec  5 07:49:36 np0005546954 systemd-udevd[211277]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.356 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b509afc4-48a2-42d2-9498-7523c3508c10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.359 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6e67014a-51 in ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:49:36 np0005546954 NetworkManager[55665]: <info>  [1764938976.3612] device (tap5a3cdfcf-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:49:36 np0005546954 NetworkManager[55665]: <info>  [1764938976.3628] device (tap5a3cdfcf-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:49:36 np0005546954 systemd-machined[153497]: New machine qemu-8-instance-0000000a.
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.363 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6e67014a-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.363 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[2db5aa2e-ecf2-4aab-a0ae-baa78985e060]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.364 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[60e46696-fb09-4a74-b549-c466115b2b75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.379 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:36 np0005546954 ovn_controller[95566]: 2025-12-05T12:49:36Z|00089|binding|INFO|Setting lport 5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 ovn-installed in OVS
Dec  5 07:49:36 np0005546954 ovn_controller[95566]: 2025-12-05T12:49:36Z|00090|binding|INFO|Setting lport 5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 up in Southbound
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.382 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.382 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[86fe0483-45f9-45f0-9fb9-d2db5eff5d36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 systemd[1]: Started Virtual Machine qemu-8-instance-0000000a.
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.409 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[09cb893c-c573-47b6-a67c-671ee68e4025]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.439 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[7059b706-94e5-4658-99de-50a0e9b0975a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.445 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[237f62d3-fad4-4462-aa5e-fb1b7c772b60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 NetworkManager[55665]: <info>  [1764938976.4471] manager: (tap6e67014a-50): new Veth device (/org/freedesktop/NetworkManager/Devices/42)
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.487 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[ef7d4382-56ed-45bc-b2b1-84a0f4664bb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.490 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[a18cf15b-552a-4e98-9960-c4a778db1853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 NetworkManager[55665]: <info>  [1764938976.5172] device (tap6e67014a-50): carrier: link connected
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.525 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[b26806b6-3ebf-41bd-b628-3ddb4d90e3a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.545 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[96af54dc-95ac-49df-9219-b2b4b3667d76]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6e67014a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:72:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398314, 'reachable_time': 30269, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211311, 'error': None, 'target': 'ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.561 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[50ba9ab9-e3bc-462d-a650-8671cb9ccc37]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef0:72e6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398314, 'tstamp': 398314}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211312, 'error': None, 'target': 'ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.580 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a36352-8d85-4b5f-8f8c-e5f3f4738fb4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6e67014a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:72:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398314, 'reachable_time': 30269, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211313, 'error': None, 'target': 'ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.612 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d664ce3a-3263-460c-899f-0eeb7ccf58a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.668 187164 DEBUG nova.network.neutron [req-ea66f3a4-3e71-486d-9503-251d652153f5 req-2207ba35-920e-49bf-8a89-952409c9ae5c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Updated VIF entry in instance network info cache for port 5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.669 187164 DEBUG nova.network.neutron [req-ea66f3a4-3e71-486d-9503-251d652153f5 req-2207ba35-920e-49bf-8a89-952409c9ae5c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Updating instance_info_cache with network_info: [{"id": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "address": "fa:16:3e:28:d9:1b", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3cdfcf-aa", "ovs_interfaceid": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.679 187164 DEBUG nova.compute.manager [req-2723e74b-63d7-4de8-baab-74cb3fa9eeda req-3eb47e5a-b871-4838-943f-b3e318ed3ed6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Received event network-vif-plugged-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.679 187164 DEBUG oslo_concurrency.lockutils [req-2723e74b-63d7-4de8-baab-74cb3fa9eeda req-3eb47e5a-b871-4838-943f-b3e318ed3ed6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.680 187164 DEBUG oslo_concurrency.lockutils [req-2723e74b-63d7-4de8-baab-74cb3fa9eeda req-3eb47e5a-b871-4838-943f-b3e318ed3ed6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.680 187164 DEBUG oslo_concurrency.lockutils [req-2723e74b-63d7-4de8-baab-74cb3fa9eeda req-3eb47e5a-b871-4838-943f-b3e318ed3ed6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.680 187164 DEBUG nova.compute.manager [req-2723e74b-63d7-4de8-baab-74cb3fa9eeda req-3eb47e5a-b871-4838-943f-b3e318ed3ed6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Processing event network-vif-plugged-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.684 187164 DEBUG oslo_concurrency.lockutils [req-ea66f3a4-3e71-486d-9503-251d652153f5 req-2207ba35-920e-49bf-8a89-952409c9ae5c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-47ffc1ff-837b-495c-a5ec-6a8b95d36137" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.692 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[12aa0fea-a0dc-41d0-8b16-79e9b4c2ae85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.695 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e67014a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.696 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.696 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e67014a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.698 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:36 np0005546954 kernel: tap6e67014a-50: entered promiscuous mode
Dec  5 07:49:36 np0005546954 NetworkManager[55665]: <info>  [1764938976.6997] manager: (tap6e67014a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.700 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.702 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6e67014a-50, col_values=(('external_ids', {'iface-id': 'd323474c-8dc0-4d9e-a642-0592117c8324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.703 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:36 np0005546954 ovn_controller[95566]: 2025-12-05T12:49:36Z|00091|binding|INFO|Releasing lport d323474c-8dc0-4d9e-a642-0592117c8324 from this chassis (sb_readonly=0)
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.718 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.722 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6e67014a-5792-45fa-ac5c-49089f8dc0ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6e67014a-5792-45fa-ac5c-49089f8dc0ef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.723 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4b159f44-a8fd-4d62-bab9-30ae8807cefc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.725 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-6e67014a-5792-45fa-ac5c-49089f8dc0ef
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/6e67014a-5792-45fa-ac5c-49089f8dc0ef.pid.haproxy
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID 6e67014a-5792-45fa-ac5c-49089f8dc0ef
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:49:36 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:49:36.726 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'env', 'PROCESS_TAG=haproxy-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6e67014a-5792-45fa-ac5c-49089f8dc0ef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.884 187164 DEBUG nova.compute.manager [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.885 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938976.8830976, 47ffc1ff-837b-495c-a5ec-6a8b95d36137 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.885 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] VM Started (Lifecycle Event)#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.896 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.908 187164 INFO nova.virt.libvirt.driver [-] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Instance spawned successfully.#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.909 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.913 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.917 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.933 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.933 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938976.8833344, 47ffc1ff-837b-495c-a5ec-6a8b95d36137 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.934 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.937 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.937 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.938 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.938 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.939 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.939 187164 DEBUG nova.virt.libvirt.driver [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.964 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.968 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764938976.8948019, 47ffc1ff-837b-495c-a5ec-6a8b95d36137 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.968 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:49:36 np0005546954 nova_compute[187160]: 2025-12-05 12:49:36.995 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:49:37 np0005546954 nova_compute[187160]: 2025-12-05 12:49:37.000 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:49:37 np0005546954 nova_compute[187160]: 2025-12-05 12:49:37.004 187164 INFO nova.compute.manager [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Took 3.86 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:49:37 np0005546954 nova_compute[187160]: 2025-12-05 12:49:37.004 187164 DEBUG nova.compute.manager [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:49:37 np0005546954 nova_compute[187160]: 2025-12-05 12:49:37.015 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:49:37 np0005546954 nova_compute[187160]: 2025-12-05 12:49:37.066 187164 INFO nova.compute.manager [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Took 4.26 seconds to build instance.#033[00m
Dec  5 07:49:37 np0005546954 nova_compute[187160]: 2025-12-05 12:49:37.087 187164 DEBUG oslo_concurrency.lockutils [None req-0b49b00f-4331-4f2c-b774-b7e071a84286 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:49:37 np0005546954 podman[211352]: 2025-12-05 12:49:37.156283134 +0000 UTC m=+0.056714371 container create fb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:49:37 np0005546954 systemd[1]: Started libpod-conmon-fb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565.scope.
Dec  5 07:49:37 np0005546954 podman[211352]: 2025-12-05 12:49:37.123345135 +0000 UTC m=+0.023776392 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:49:37 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:49:37 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e363d86e4112ed9454aec6ec80cfad5faaf10edcfdf7140c6fbd2e5573060ba5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:49:37 np0005546954 podman[211352]: 2025-12-05 12:49:37.248827504 +0000 UTC m=+0.149258771 container init fb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 07:49:37 np0005546954 podman[211352]: 2025-12-05 12:49:37.255650349 +0000 UTC m=+0.156081586 container start fb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:49:37 np0005546954 neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef[211368]: [NOTICE]   (211372) : New worker (211374) forked
Dec  5 07:49:37 np0005546954 neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef[211368]: [NOTICE]   (211372) : Loading success.
Dec  5 07:49:38 np0005546954 nova_compute[187160]: 2025-12-05 12:49:38.746 187164 DEBUG nova.compute.manager [req-a20da45b-cd49-4abb-9570-5fd5367f718d req-01659456-4402-4d87-a467-027d30013cb6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Received event network-vif-plugged-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:49:38 np0005546954 nova_compute[187160]: 2025-12-05 12:49:38.747 187164 DEBUG oslo_concurrency.lockutils [req-a20da45b-cd49-4abb-9570-5fd5367f718d req-01659456-4402-4d87-a467-027d30013cb6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:49:38 np0005546954 nova_compute[187160]: 2025-12-05 12:49:38.748 187164 DEBUG oslo_concurrency.lockutils [req-a20da45b-cd49-4abb-9570-5fd5367f718d req-01659456-4402-4d87-a467-027d30013cb6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:49:38 np0005546954 nova_compute[187160]: 2025-12-05 12:49:38.748 187164 DEBUG oslo_concurrency.lockutils [req-a20da45b-cd49-4abb-9570-5fd5367f718d req-01659456-4402-4d87-a467-027d30013cb6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:49:38 np0005546954 nova_compute[187160]: 2025-12-05 12:49:38.748 187164 DEBUG nova.compute.manager [req-a20da45b-cd49-4abb-9570-5fd5367f718d req-01659456-4402-4d87-a467-027d30013cb6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] No waiting events found dispatching network-vif-plugged-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:49:38 np0005546954 nova_compute[187160]: 2025-12-05 12:49:38.749 187164 WARNING nova.compute.manager [req-a20da45b-cd49-4abb-9570-5fd5367f718d req-01659456-4402-4d87-a467-027d30013cb6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Received unexpected event network-vif-plugged-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:49:39 np0005546954 nova_compute[187160]: 2025-12-05 12:49:39.663 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:40 np0005546954 nova_compute[187160]: 2025-12-05 12:49:40.792 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:44 np0005546954 nova_compute[187160]: 2025-12-05 12:49:44.666 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:45 np0005546954 nova_compute[187160]: 2025-12-05 12:49:45.797 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:46 np0005546954 podman[211384]: 2025-12-05 12:49:46.59627544 +0000 UTC m=+0.097764346 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec  5 07:49:49 np0005546954 nova_compute[187160]: 2025-12-05 12:49:49.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:49:49 np0005546954 nova_compute[187160]: 2025-12-05 12:49:49.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 07:49:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:49:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:49:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:49:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:49:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:49:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:49:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:49:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:49:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:49:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:49:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:49:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:49:49 np0005546954 ovn_controller[95566]: 2025-12-05T12:49:49Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:28:d9:1b 10.100.0.5
Dec  5 07:49:49 np0005546954 ovn_controller[95566]: 2025-12-05T12:49:49Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:28:d9:1b 10.100.0.5
Dec  5 07:49:49 np0005546954 nova_compute[187160]: 2025-12-05 12:49:49.669 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:50 np0005546954 podman[211425]: 2025-12-05 12:49:50.605758983 +0000 UTC m=+0.100661677 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:49:50 np0005546954 podman[211424]: 2025-12-05 12:49:50.627406646 +0000 UTC m=+0.122070132 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  5 07:49:50 np0005546954 nova_compute[187160]: 2025-12-05 12:49:50.799 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:52 np0005546954 nova_compute[187160]: 2025-12-05 12:49:52.055 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:49:52 np0005546954 nova_compute[187160]: 2025-12-05 12:49:52.055 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 07:49:52 np0005546954 nova_compute[187160]: 2025-12-05 12:49:52.075 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 07:49:54 np0005546954 nova_compute[187160]: 2025-12-05 12:49:54.060 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:49:54 np0005546954 nova_compute[187160]: 2025-12-05 12:49:54.060 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:49:54 np0005546954 nova_compute[187160]: 2025-12-05 12:49:54.061 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:49:54 np0005546954 nova_compute[187160]: 2025-12-05 12:49:54.422 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-47ffc1ff-837b-495c-a5ec-6a8b95d36137" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:49:54 np0005546954 nova_compute[187160]: 2025-12-05 12:49:54.423 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-47ffc1ff-837b-495c-a5ec-6a8b95d36137" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:49:54 np0005546954 nova_compute[187160]: 2025-12-05 12:49:54.423 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:49:54 np0005546954 nova_compute[187160]: 2025-12-05 12:49:54.423 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 47ffc1ff-837b-495c-a5ec-6a8b95d36137 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:49:54 np0005546954 nova_compute[187160]: 2025-12-05 12:49:54.671 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:55 np0005546954 nova_compute[187160]: 2025-12-05 12:49:55.442 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Updating instance_info_cache with network_info: [{"id": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "address": "fa:16:3e:28:d9:1b", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3cdfcf-aa", "ovs_interfaceid": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:49:55 np0005546954 nova_compute[187160]: 2025-12-05 12:49:55.464 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-47ffc1ff-837b-495c-a5ec-6a8b95d36137" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:49:55 np0005546954 nova_compute[187160]: 2025-12-05 12:49:55.464 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:49:55 np0005546954 nova_compute[187160]: 2025-12-05 12:49:55.465 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:49:55 np0005546954 nova_compute[187160]: 2025-12-05 12:49:55.802 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:49:56 np0005546954 nova_compute[187160]: 2025-12-05 12:49:56.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:49:58 np0005546954 nova_compute[187160]: 2025-12-05 12:49:58.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:49:58 np0005546954 nova_compute[187160]: 2025-12-05 12:49:58.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:49:59 np0005546954 nova_compute[187160]: 2025-12-05 12:49:59.674 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.267 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.295 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.296 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.296 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.297 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.392 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.492 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.494 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.555 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.747 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.749 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5690MB free_disk=73.30718612670898GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.750 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.750 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.806 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.867 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 47ffc1ff-837b-495c-a5ec-6a8b95d36137 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.868 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.868 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:50:00 np0005546954 nova_compute[187160]: 2025-12-05 12:50:00.999 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:50:01 np0005546954 nova_compute[187160]: 2025-12-05 12:50:01.019 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:50:01 np0005546954 nova_compute[187160]: 2025-12-05 12:50:01.042 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:50:01 np0005546954 nova_compute[187160]: 2025-12-05 12:50:01.043 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:01 np0005546954 nova_compute[187160]: 2025-12-05 12:50:01.809 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:50:01 np0005546954 nova_compute[187160]: 2025-12-05 12:50:01.812 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:50:01 np0005546954 nova_compute[187160]: 2025-12-05 12:50:01.812 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:50:03 np0005546954 podman[211479]: 2025-12-05 12:50:03.570407734 +0000 UTC m=+0.084071084 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Dec  5 07:50:03 np0005546954 podman[211480]: 2025-12-05 12:50:03.594444962 +0000 UTC m=+0.094945377 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Dec  5 07:50:03 np0005546954 nova_compute[187160]: 2025-12-05 12:50:03.693 187164 DEBUG nova.virt.libvirt.driver [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Creating tmpfile /var/lib/nova/instances/tmp05iui1vv to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  5 07:50:03 np0005546954 nova_compute[187160]: 2025-12-05 12:50:03.695 187164 DEBUG nova.compute.manager [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp05iui1vv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  5 07:50:04 np0005546954 nova_compute[187160]: 2025-12-05 12:50:04.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:50:04 np0005546954 nova_compute[187160]: 2025-12-05 12:50:04.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:50:04 np0005546954 nova_compute[187160]: 2025-12-05 12:50:04.677 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:05 np0005546954 nova_compute[187160]: 2025-12-05 12:50:05.198 187164 DEBUG nova.compute.manager [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp05iui1vv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  5 07:50:05 np0005546954 nova_compute[187160]: 2025-12-05 12:50:05.229 187164 DEBUG oslo_concurrency.lockutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:50:05 np0005546954 nova_compute[187160]: 2025-12-05 12:50:05.229 187164 DEBUG oslo_concurrency.lockutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:50:05 np0005546954 nova_compute[187160]: 2025-12-05 12:50:05.230 187164 DEBUG nova.network.neutron [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:50:05 np0005546954 podman[197513]: time="2025-12-05T12:50:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:50:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:50:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 07:50:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:50:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3044 "" "Go-http-client/1.1"
Dec  5 07:50:05 np0005546954 nova_compute[187160]: 2025-12-05 12:50:05.809 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.145 187164 DEBUG nova.network.neutron [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Updating instance_info_cache with network_info: [{"id": "a3779d5b-46e5-441f-a30d-bb3473e5512d", "address": "fa:16:3e:46:8e:32", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3779d5b-46", "ovs_interfaceid": "a3779d5b-46e5-441f-a30d-bb3473e5512d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.163 187164 DEBUG oslo_concurrency.lockutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.165 187164 DEBUG nova.virt.libvirt.driver [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp05iui1vv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.166 187164 DEBUG nova.virt.libvirt.driver [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Creating instance directory: /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.166 187164 DEBUG nova.virt.libvirt.driver [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Creating disk.info with the contents: {'/var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk': 'qcow2', '/var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.167 187164 DEBUG nova.virt.libvirt.driver [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.167 187164 DEBUG nova.objects.instance [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'trusted_certs' on Instance uuid badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.195 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.276 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.277 187164 DEBUG oslo_concurrency.lockutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.278 187164 DEBUG oslo_concurrency.lockutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.288 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.346 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.348 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.385 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.386 187164 DEBUG oslo_concurrency.lockutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.387 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.447 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.449 187164 DEBUG nova.virt.disk.api [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Checking if we can resize image /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.450 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.527 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.528 187164 DEBUG nova.virt.disk.api [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Cannot resize image /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.528 187164 DEBUG nova.objects.instance [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'migration_context' on Instance uuid badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.549 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.582 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk.config 485376" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.584 187164 DEBUG nova.virt.libvirt.volume.remotefs [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk.config to /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  5 07:50:06 np0005546954 nova_compute[187160]: 2025-12-05 12:50:06.584 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk.config /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:50:06 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:06Z|00092|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.026 187164 DEBUG oslo_concurrency.processutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5/disk.config /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.027 187164 DEBUG nova.virt.libvirt.driver [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.029 187164 DEBUG nova.virt.libvirt.vif [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:49:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1753685923',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1753685923',id=9,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:49:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7e0665b21e8d4fc092797e18e0320f99',ramdisk_id='',reservation_id='r-x7ucr6f7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-680209779',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-680209779-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:49:24Z,user_data=None,user_id='40d645c136824137bea297268f8a9cee',uuid=badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a3779d5b-46e5-441f-a30d-bb3473e5512d", "address": "fa:16:3e:46:8e:32", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa3779d5b-46", "ovs_interfaceid": "a3779d5b-46e5-441f-a30d-bb3473e5512d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.029 187164 DEBUG nova.network.os_vif_util [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "a3779d5b-46e5-441f-a30d-bb3473e5512d", "address": "fa:16:3e:46:8e:32", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa3779d5b-46", "ovs_interfaceid": "a3779d5b-46e5-441f-a30d-bb3473e5512d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.030 187164 DEBUG nova.network.os_vif_util [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:8e:32,bridge_name='br-int',has_traffic_filtering=True,id=a3779d5b-46e5-441f-a30d-bb3473e5512d,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3779d5b-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.030 187164 DEBUG os_vif [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:8e:32,bridge_name='br-int',has_traffic_filtering=True,id=a3779d5b-46e5-441f-a30d-bb3473e5512d,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3779d5b-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.031 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.032 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.032 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.035 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.035 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3779d5b-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.036 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa3779d5b-46, col_values=(('external_ids', {'iface-id': 'a3779d5b-46e5-441f-a30d-bb3473e5512d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:8e:32', 'vm-uuid': 'badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.037 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:07 np0005546954 NetworkManager[55665]: <info>  [1764939007.0391] manager: (tapa3779d5b-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.040 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.045 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.046 187164 INFO os_vif [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:8e:32,bridge_name='br-int',has_traffic_filtering=True,id=a3779d5b-46e5-441f-a30d-bb3473e5512d,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3779d5b-46')#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.047 187164 DEBUG nova.virt.libvirt.driver [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  5 07:50:07 np0005546954 nova_compute[187160]: 2025-12-05 12:50:07.047 187164 DEBUG nova.compute.manager [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp05iui1vv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  5 07:50:08 np0005546954 nova_compute[187160]: 2025-12-05 12:50:08.446 187164 DEBUG nova.network.neutron [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Port a3779d5b-46e5-441f-a30d-bb3473e5512d updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  5 07:50:08 np0005546954 nova_compute[187160]: 2025-12-05 12:50:08.448 187164 DEBUG nova.compute.manager [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp05iui1vv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  5 07:50:08 np0005546954 systemd[1]: Starting libvirt proxy daemon...
Dec  5 07:50:08 np0005546954 systemd[1]: Started libvirt proxy daemon.
Dec  5 07:50:08 np0005546954 kernel: tapa3779d5b-46: entered promiscuous mode
Dec  5 07:50:08 np0005546954 NetworkManager[55665]: <info>  [1764939008.7379] manager: (tapa3779d5b-46): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Dec  5 07:50:08 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:08Z|00093|binding|INFO|Claiming lport a3779d5b-46e5-441f-a30d-bb3473e5512d for this additional chassis.
Dec  5 07:50:08 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:08Z|00094|binding|INFO|a3779d5b-46e5-441f-a30d-bb3473e5512d: Claiming fa:16:3e:46:8e:32 10.100.0.14
Dec  5 07:50:08 np0005546954 nova_compute[187160]: 2025-12-05 12:50:08.742 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:08 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:08Z|00095|binding|INFO|Setting lport a3779d5b-46e5-441f-a30d-bb3473e5512d ovn-installed in OVS
Dec  5 07:50:08 np0005546954 systemd-udevd[211574]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:50:08 np0005546954 nova_compute[187160]: 2025-12-05 12:50:08.766 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:08 np0005546954 nova_compute[187160]: 2025-12-05 12:50:08.770 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:08 np0005546954 NetworkManager[55665]: <info>  [1764939008.7811] device (tapa3779d5b-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:50:08 np0005546954 NetworkManager[55665]: <info>  [1764939008.7819] device (tapa3779d5b-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:50:08 np0005546954 systemd-machined[153497]: New machine qemu-9-instance-00000009.
Dec  5 07:50:08 np0005546954 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Dec  5 07:50:09 np0005546954 nova_compute[187160]: 2025-12-05 12:50:09.678 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:10 np0005546954 nova_compute[187160]: 2025-12-05 12:50:10.408 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939010.4079452, badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:50:10 np0005546954 nova_compute[187160]: 2025-12-05 12:50:10.409 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] VM Started (Lifecycle Event)#033[00m
Dec  5 07:50:10 np0005546954 nova_compute[187160]: 2025-12-05 12:50:10.432 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:50:11 np0005546954 nova_compute[187160]: 2025-12-05 12:50:11.219 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939011.2188063, badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:50:11 np0005546954 nova_compute[187160]: 2025-12-05 12:50:11.221 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:50:11 np0005546954 nova_compute[187160]: 2025-12-05 12:50:11.242 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:50:11 np0005546954 nova_compute[187160]: 2025-12-05 12:50:11.246 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:50:11 np0005546954 nova_compute[187160]: 2025-12-05 12:50:11.264 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  5 07:50:12 np0005546954 nova_compute[187160]: 2025-12-05 12:50:12.038 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:12 np0005546954 nova_compute[187160]: 2025-12-05 12:50:12.517 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.516 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.518 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:50:12 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:12Z|00096|binding|INFO|Claiming lport a3779d5b-46e5-441f-a30d-bb3473e5512d for this chassis.
Dec  5 07:50:12 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:12Z|00097|binding|INFO|a3779d5b-46e5-441f-a30d-bb3473e5512d: Claiming fa:16:3e:46:8e:32 10.100.0.14
Dec  5 07:50:12 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:12Z|00098|binding|INFO|Setting lport a3779d5b-46e5-441f-a30d-bb3473e5512d up in Southbound
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.552 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:8e:32 10.100.0.14'], port_security=['fa:16:3e:46:8e:32 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e0665b21e8d4fc092797e18e0320f99', 'neutron:revision_number': '11', 'neutron:security_group_ids': '66fe94db-41ba-4d2d-b949-4aa77c13cd33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d4f604e-5305-48b4-8813-83692bab39ba, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=a3779d5b-46e5-441f-a30d-bb3473e5512d) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.554 104428 INFO neutron.agent.ovn.metadata.agent [-] Port a3779d5b-46e5-441f-a30d-bb3473e5512d in datapath 6e67014a-5792-45fa-ac5c-49089f8dc0ef bound to our chassis#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.556 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6e67014a-5792-45fa-ac5c-49089f8dc0ef#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.578 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ac1e0142-bb60-4637-9ad8-10156883d428]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.617 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[7218a2b9-5ffd-4b67-a843-2a00e3be04d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.620 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[b909cf28-260c-4034-a716-d55449d06c78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.650 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[95da6156-3dca-419b-b79a-965858ad1952]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.668 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[fb27007f-6bbe-4f65-b558-5ba4e323ed4a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6e67014a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:72:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398314, 'reachable_time': 30269, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211610, 'error': None, 'target': 'ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.692 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[8aa7951e-e157-466a-a56a-ceb83f0863ea]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6e67014a-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398326, 'tstamp': 398326}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211611, 'error': None, 'target': 'ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6e67014a-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398331, 'tstamp': 398331}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211611, 'error': None, 'target': 'ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.694 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e67014a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:12 np0005546954 nova_compute[187160]: 2025-12-05 12:50:12.696 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:12 np0005546954 nova_compute[187160]: 2025-12-05 12:50:12.697 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.697 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e67014a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.698 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.698 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6e67014a-50, col_values=(('external_ids', {'iface-id': 'd323474c-8dc0-4d9e-a642-0592117c8324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:12.698 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:50:12 np0005546954 nova_compute[187160]: 2025-12-05 12:50:12.748 187164 INFO nova.compute.manager [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Post operation of migration started#033[00m
Dec  5 07:50:12 np0005546954 nova_compute[187160]: 2025-12-05 12:50:12.967 187164 DEBUG oslo_concurrency.lockutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:50:12 np0005546954 nova_compute[187160]: 2025-12-05 12:50:12.968 187164 DEBUG oslo_concurrency.lockutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:50:12 np0005546954 nova_compute[187160]: 2025-12-05 12:50:12.968 187164 DEBUG nova.network.neutron [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:50:14 np0005546954 nova_compute[187160]: 2025-12-05 12:50:14.013 187164 DEBUG nova.network.neutron [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Updating instance_info_cache with network_info: [{"id": "a3779d5b-46e5-441f-a30d-bb3473e5512d", "address": "fa:16:3e:46:8e:32", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3779d5b-46", "ovs_interfaceid": "a3779d5b-46e5-441f-a30d-bb3473e5512d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:50:14 np0005546954 nova_compute[187160]: 2025-12-05 12:50:14.040 187164 DEBUG oslo_concurrency.lockutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:50:14 np0005546954 nova_compute[187160]: 2025-12-05 12:50:14.058 187164 DEBUG oslo_concurrency.lockutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:14 np0005546954 nova_compute[187160]: 2025-12-05 12:50:14.059 187164 DEBUG oslo_concurrency.lockutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:14 np0005546954 nova_compute[187160]: 2025-12-05 12:50:14.059 187164 DEBUG oslo_concurrency.lockutils [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:14 np0005546954 nova_compute[187160]: 2025-12-05 12:50:14.065 187164 INFO nova.virt.libvirt.driver [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  5 07:50:14 np0005546954 virtqemud[186730]: Domain id=9 name='instance-00000009' uuid=badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5 is tainted: custom-monitor
Dec  5 07:50:14 np0005546954 nova_compute[187160]: 2025-12-05 12:50:14.681 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:15 np0005546954 nova_compute[187160]: 2025-12-05 12:50:15.073 187164 INFO nova.virt.libvirt.driver [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  5 07:50:16 np0005546954 nova_compute[187160]: 2025-12-05 12:50:16.080 187164 INFO nova.virt.libvirt.driver [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  5 07:50:16 np0005546954 nova_compute[187160]: 2025-12-05 12:50:16.086 187164 DEBUG nova.compute.manager [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:50:16 np0005546954 nova_compute[187160]: 2025-12-05 12:50:16.107 187164 DEBUG nova.objects.instance [None req-555b8b8e-9eeb-4f95-be79-7ef96432724f 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:50:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:16.947 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:16.948 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:16.949 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:17 np0005546954 nova_compute[187160]: 2025-12-05 12:50:17.042 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:17 np0005546954 podman[211612]: 2025-12-05 12:50:17.588134354 +0000 UTC m=+0.084356943 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  5 07:50:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:50:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:50:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:50:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:50:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:50:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:50:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:50:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:50:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:50:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:50:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:50:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:50:19 np0005546954 nova_compute[187160]: 2025-12-05 12:50:19.684 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:20.521 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.392 187164 DEBUG oslo_concurrency.lockutils [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.392 187164 DEBUG oslo_concurrency.lockutils [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.393 187164 DEBUG oslo_concurrency.lockutils [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.393 187164 DEBUG oslo_concurrency.lockutils [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.393 187164 DEBUG oslo_concurrency.lockutils [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.394 187164 INFO nova.compute.manager [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Terminating instance#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.395 187164 DEBUG nova.compute.manager [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:50:21 np0005546954 kernel: tap5a3cdfcf-aa (unregistering): left promiscuous mode
Dec  5 07:50:21 np0005546954 NetworkManager[55665]: <info>  [1764939021.4298] device (tap5a3cdfcf-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:50:21 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:21Z|00099|binding|INFO|Releasing lport 5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 from this chassis (sb_readonly=0)
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.437 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:21 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:21Z|00100|binding|INFO|Setting lport 5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 down in Southbound
Dec  5 07:50:21 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:21Z|00101|binding|INFO|Removing iface tap5a3cdfcf-aa ovn-installed in OVS
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.441 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.447 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:d9:1b 10.100.0.5'], port_security=['fa:16:3e:28:d9:1b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '47ffc1ff-837b-495c-a5ec-6a8b95d36137', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e0665b21e8d4fc092797e18e0320f99', 'neutron:revision_number': '4', 'neutron:security_group_ids': '66fe94db-41ba-4d2d-b949-4aa77c13cd33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d4f604e-5305-48b4-8813-83692bab39ba, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.450 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 in datapath 6e67014a-5792-45fa-ac5c-49089f8dc0ef unbound from our chassis#033[00m
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.453 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6e67014a-5792-45fa-ac5c-49089f8dc0ef#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.455 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.481 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[37bf5376-d271-404f-b45e-e6eddbfe692b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:21 np0005546954 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Dec  5 07:50:21 np0005546954 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Consumed 14.007s CPU time.
Dec  5 07:50:21 np0005546954 systemd-machined[153497]: Machine qemu-8-instance-0000000a terminated.
Dec  5 07:50:21 np0005546954 podman[211635]: 2025-12-05 12:50:21.524962403 +0000 UTC m=+0.062623267 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.529 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[fef60edd-0a18-49cd-b8a5-bb385acda124]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.532 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[ceb2406c-85dd-475f-9fd7-cf485ce8c70b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:21 np0005546954 podman[211632]: 2025-12-05 12:50:21.562556309 +0000 UTC m=+0.099710577 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.565 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[fc664f02-99d5-423c-8b62-228f8e9dce29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.582 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f79cd1de-248f-43ae-8000-1698573fb0fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6e67014a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:72:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398314, 'reachable_time': 30269, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211692, 'error': None, 'target': 'ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.596 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[6dfff713-c247-4d31-8ba6-6192db557e5c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6e67014a-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398326, 'tstamp': 398326}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211693, 'error': None, 'target': 'ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6e67014a-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398331, 'tstamp': 398331}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211693, 'error': None, 'target': 'ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.598 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e67014a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.600 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.607 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.607 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e67014a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.608 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.608 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6e67014a-50, col_values=(('external_ids', {'iface-id': 'd323474c-8dc0-4d9e-a642-0592117c8324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:21.609 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.619 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.624 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.668 187164 INFO nova.virt.libvirt.driver [-] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Instance destroyed successfully.#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.669 187164 DEBUG nova.objects.instance [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lazy-loading 'resources' on Instance uuid 47ffc1ff-837b-495c-a5ec-6a8b95d36137 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.682 187164 DEBUG nova.virt.libvirt.vif [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:49:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-799514834',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-799514834',id=10,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:49:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e0665b21e8d4fc092797e18e0320f99',ramdisk_id='',reservation_id='r-731kb9gv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-680209779',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-680209779-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:49:37Z,user_data=None,user_id='40d645c136824137bea297268f8a9cee',uuid=47ffc1ff-837b-495c-a5ec-6a8b95d36137,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "address": "fa:16:3e:28:d9:1b", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3cdfcf-aa", "ovs_interfaceid": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.682 187164 DEBUG nova.network.os_vif_util [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Converting VIF {"id": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "address": "fa:16:3e:28:d9:1b", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5a3cdfcf-aa", "ovs_interfaceid": "5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.683 187164 DEBUG nova.network.os_vif_util [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:28:d9:1b,bridge_name='br-int',has_traffic_filtering=True,id=5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3cdfcf-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.683 187164 DEBUG os_vif [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:d9:1b,bridge_name='br-int',has_traffic_filtering=True,id=5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3cdfcf-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.685 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.685 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5a3cdfcf-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.687 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.688 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.692 187164 INFO os_vif [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:d9:1b,bridge_name='br-int',has_traffic_filtering=True,id=5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5a3cdfcf-aa')#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.693 187164 INFO nova.virt.libvirt.driver [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Deleting instance files /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137_del#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.693 187164 INFO nova.virt.libvirt.driver [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Deletion of /var/lib/nova/instances/47ffc1ff-837b-495c-a5ec-6a8b95d36137_del complete#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.744 187164 INFO nova.compute.manager [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.745 187164 DEBUG oslo.service.loopingcall [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.745 187164 DEBUG nova.compute.manager [-] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:50:21 np0005546954 nova_compute[187160]: 2025-12-05 12:50:21.745 187164 DEBUG nova.network.neutron [-] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.546 187164 DEBUG nova.compute.manager [req-a0c4fb8b-edd0-41df-8606-f5097752656b req-92a9180a-7696-4c1c-997e-e94c38a6fac8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Received event network-vif-unplugged-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.546 187164 DEBUG oslo_concurrency.lockutils [req-a0c4fb8b-edd0-41df-8606-f5097752656b req-92a9180a-7696-4c1c-997e-e94c38a6fac8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.547 187164 DEBUG oslo_concurrency.lockutils [req-a0c4fb8b-edd0-41df-8606-f5097752656b req-92a9180a-7696-4c1c-997e-e94c38a6fac8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.547 187164 DEBUG oslo_concurrency.lockutils [req-a0c4fb8b-edd0-41df-8606-f5097752656b req-92a9180a-7696-4c1c-997e-e94c38a6fac8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.548 187164 DEBUG nova.compute.manager [req-a0c4fb8b-edd0-41df-8606-f5097752656b req-92a9180a-7696-4c1c-997e-e94c38a6fac8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] No waiting events found dispatching network-vif-unplugged-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.548 187164 DEBUG nova.compute.manager [req-a0c4fb8b-edd0-41df-8606-f5097752656b req-92a9180a-7696-4c1c-997e-e94c38a6fac8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Received event network-vif-unplugged-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.685 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.690 187164 DEBUG nova.network.neutron [-] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.711 187164 INFO nova.compute.manager [-] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Took 2.97 seconds to deallocate network for instance.#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.748 187164 DEBUG oslo_concurrency.lockutils [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.748 187164 DEBUG oslo_concurrency.lockutils [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.832 187164 DEBUG nova.compute.provider_tree [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.852 187164 DEBUG nova.scheduler.client.report [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.869 187164 DEBUG oslo_concurrency.lockutils [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.899 187164 INFO nova.scheduler.client.report [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Deleted allocations for instance 47ffc1ff-837b-495c-a5ec-6a8b95d36137#033[00m
Dec  5 07:50:24 np0005546954 nova_compute[187160]: 2025-12-05 12:50:24.971 187164 DEBUG oslo_concurrency.lockutils [None req-95420fac-df86-4799-8b8d-4a36aad61f21 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:25 np0005546954 nova_compute[187160]: 2025-12-05 12:50:25.974 187164 DEBUG oslo_concurrency.lockutils [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:25 np0005546954 nova_compute[187160]: 2025-12-05 12:50:25.975 187164 DEBUG oslo_concurrency.lockutils [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:25 np0005546954 nova_compute[187160]: 2025-12-05 12:50:25.975 187164 DEBUG oslo_concurrency.lockutils [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:25 np0005546954 nova_compute[187160]: 2025-12-05 12:50:25.976 187164 DEBUG oslo_concurrency.lockutils [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:25 np0005546954 nova_compute[187160]: 2025-12-05 12:50:25.976 187164 DEBUG oslo_concurrency.lockutils [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:25 np0005546954 nova_compute[187160]: 2025-12-05 12:50:25.977 187164 INFO nova.compute.manager [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Terminating instance#033[00m
Dec  5 07:50:25 np0005546954 nova_compute[187160]: 2025-12-05 12:50:25.979 187164 DEBUG nova.compute.manager [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:50:25 np0005546954 kernel: tapa3779d5b-46 (unregistering): left promiscuous mode
Dec  5 07:50:26 np0005546954 NetworkManager[55665]: <info>  [1764939026.0003] device (tapa3779d5b-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.011 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:26 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:26Z|00102|binding|INFO|Releasing lport a3779d5b-46e5-441f-a30d-bb3473e5512d from this chassis (sb_readonly=0)
Dec  5 07:50:26 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:26Z|00103|binding|INFO|Setting lport a3779d5b-46e5-441f-a30d-bb3473e5512d down in Southbound
Dec  5 07:50:26 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:26Z|00104|binding|INFO|Removing iface tapa3779d5b-46 ovn-installed in OVS
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.018 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:8e:32 10.100.0.14'], port_security=['fa:16:3e:46:8e:32 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e0665b21e8d4fc092797e18e0320f99', 'neutron:revision_number': '13', 'neutron:security_group_ids': '66fe94db-41ba-4d2d-b949-4aa77c13cd33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d4f604e-5305-48b4-8813-83692bab39ba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=a3779d5b-46e5-441f-a30d-bb3473e5512d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.019 104428 INFO neutron.agent.ovn.metadata.agent [-] Port a3779d5b-46e5-441f-a30d-bb3473e5512d in datapath 6e67014a-5792-45fa-ac5c-49089f8dc0ef unbound from our chassis#033[00m
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.020 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6e67014a-5792-45fa-ac5c-49089f8dc0ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.022 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a41c44-ae6d-4665-8bd9-7dabe9ecd2a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.022 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef namespace which is not needed anymore#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.033 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:26 np0005546954 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Dec  5 07:50:26 np0005546954 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 2.699s CPU time.
Dec  5 07:50:26 np0005546954 systemd-machined[153497]: Machine qemu-9-instance-00000009 terminated.
Dec  5 07:50:26 np0005546954 neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef[211368]: [NOTICE]   (211372) : haproxy version is 2.8.14-c23fe91
Dec  5 07:50:26 np0005546954 neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef[211368]: [NOTICE]   (211372) : path to executable is /usr/sbin/haproxy
Dec  5 07:50:26 np0005546954 neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef[211368]: [WARNING]  (211372) : Exiting Master process...
Dec  5 07:50:26 np0005546954 neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef[211368]: [ALERT]    (211372) : Current worker (211374) exited with code 143 (Terminated)
Dec  5 07:50:26 np0005546954 neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef[211368]: [WARNING]  (211372) : All workers exited. Exiting... (0)
Dec  5 07:50:26 np0005546954 systemd[1]: libpod-fb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565.scope: Deactivated successfully.
Dec  5 07:50:26 np0005546954 podman[211734]: 2025-12-05 12:50:26.176268778 +0000 UTC m=+0.059748346 container died fb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec  5 07:50:26 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565-userdata-shm.mount: Deactivated successfully.
Dec  5 07:50:26 np0005546954 systemd[1]: var-lib-containers-storage-overlay-e363d86e4112ed9454aec6ec80cfad5faaf10edcfdf7140c6fbd2e5573060ba5-merged.mount: Deactivated successfully.
Dec  5 07:50:26 np0005546954 podman[211734]: 2025-12-05 12:50:26.23651554 +0000 UTC m=+0.119995118 container cleanup fb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:50:26 np0005546954 systemd[1]: libpod-conmon-fb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565.scope: Deactivated successfully.
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.254 187164 INFO nova.virt.libvirt.driver [-] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Instance destroyed successfully.#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.255 187164 DEBUG nova.objects.instance [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lazy-loading 'resources' on Instance uuid badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.267 187164 DEBUG nova.virt.libvirt.vif [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:49:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1753685923',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1753685923',id=9,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:49:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e0665b21e8d4fc092797e18e0320f99',ramdisk_id='',reservation_id='r-x7ucr6f7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-680209779',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-680209779-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:50:16Z,user_data=None,user_id='40d645c136824137bea297268f8a9cee',uuid=badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a3779d5b-46e5-441f-a30d-bb3473e5512d", "address": "fa:16:3e:46:8e:32", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3779d5b-46", "ovs_interfaceid": "a3779d5b-46e5-441f-a30d-bb3473e5512d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.268 187164 DEBUG nova.network.os_vif_util [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Converting VIF {"id": "a3779d5b-46e5-441f-a30d-bb3473e5512d", "address": "fa:16:3e:46:8e:32", "network": {"id": "6e67014a-5792-45fa-ac5c-49089f8dc0ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1349109889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e0665b21e8d4fc092797e18e0320f99", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3779d5b-46", "ovs_interfaceid": "a3779d5b-46e5-441f-a30d-bb3473e5512d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.269 187164 DEBUG nova.network.os_vif_util [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:8e:32,bridge_name='br-int',has_traffic_filtering=True,id=a3779d5b-46e5-441f-a30d-bb3473e5512d,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3779d5b-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.269 187164 DEBUG os_vif [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:8e:32,bridge_name='br-int',has_traffic_filtering=True,id=a3779d5b-46e5-441f-a30d-bb3473e5512d,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3779d5b-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.271 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.271 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3779d5b-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.272 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.274 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.276 187164 INFO os_vif [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:8e:32,bridge_name='br-int',has_traffic_filtering=True,id=a3779d5b-46e5-441f-a30d-bb3473e5512d,network=Network(6e67014a-5792-45fa-ac5c-49089f8dc0ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3779d5b-46')#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.277 187164 INFO nova.virt.libvirt.driver [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Deleting instance files /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5_del#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.278 187164 INFO nova.virt.libvirt.driver [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Deletion of /var/lib/nova/instances/badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5_del complete#033[00m
Dec  5 07:50:26 np0005546954 podman[211781]: 2025-12-05 12:50:26.312183488 +0000 UTC m=+0.048499702 container remove fb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.317 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[49f5b904-1732-4034-b569-bcdda937d438]: (4, ('Fri Dec  5 12:50:26 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef (fb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565)\nfb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565\nFri Dec  5 12:50:26 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef (fb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565)\nfb11ee4d36c5611686c37d5b182f222fe8d860c79694c5dea8b69fb41fa25565\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.319 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[9162b72b-c82e-49c5-98fe-625af66752b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.320 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e67014a-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.322 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:26 np0005546954 kernel: tap6e67014a-50: left promiscuous mode
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.324 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.327 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ea5938-4a1e-429c-b26c-c1eb1dbfd521]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.337 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.346 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6c1729-f88a-4147-bb8c-88e9214d2e50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.347 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e86a60b2-393a-48b9-bb1d-2b38731c8c6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.362 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[58c5b46e-8cc5-42cd-9d60-1711337f2a79]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398305, 'reachable_time': 32023, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211797, 'error': None, 'target': 'ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.366 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6e67014a-5792-45fa-ac5c-49089f8dc0ef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:50:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:50:26.366 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[6491ba81-5295-4cac-9122-2f1988f74413]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:50:26 np0005546954 systemd[1]: run-netns-ovnmeta\x2d6e67014a\x2d5792\x2d45fa\x2dac5c\x2d49089f8dc0ef.mount: Deactivated successfully.
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.395 187164 INFO nova.compute.manager [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.396 187164 DEBUG oslo.service.loopingcall [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.396 187164 DEBUG nova.compute.manager [-] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.396 187164 DEBUG nova.network.neutron [-] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.634 187164 DEBUG nova.compute.manager [req-ebc7572e-96b5-4572-9e8e-ce2343fa15a3 req-27a605a3-2e56-4da4-97e0-5e05fb38279d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Received event network-vif-plugged-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.635 187164 DEBUG oslo_concurrency.lockutils [req-ebc7572e-96b5-4572-9e8e-ce2343fa15a3 req-27a605a3-2e56-4da4-97e0-5e05fb38279d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.636 187164 DEBUG oslo_concurrency.lockutils [req-ebc7572e-96b5-4572-9e8e-ce2343fa15a3 req-27a605a3-2e56-4da4-97e0-5e05fb38279d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.637 187164 DEBUG oslo_concurrency.lockutils [req-ebc7572e-96b5-4572-9e8e-ce2343fa15a3 req-27a605a3-2e56-4da4-97e0-5e05fb38279d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "47ffc1ff-837b-495c-a5ec-6a8b95d36137-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.637 187164 DEBUG nova.compute.manager [req-ebc7572e-96b5-4572-9e8e-ce2343fa15a3 req-27a605a3-2e56-4da4-97e0-5e05fb38279d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] No waiting events found dispatching network-vif-plugged-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.638 187164 WARNING nova.compute.manager [req-ebc7572e-96b5-4572-9e8e-ce2343fa15a3 req-27a605a3-2e56-4da4-97e0-5e05fb38279d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Received unexpected event network-vif-plugged-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:50:26 np0005546954 nova_compute[187160]: 2025-12-05 12:50:26.638 187164 DEBUG nova.compute.manager [req-ebc7572e-96b5-4572-9e8e-ce2343fa15a3 req-27a605a3-2e56-4da4-97e0-5e05fb38279d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Received event network-vif-deleted-5a3cdfcf-aa7e-4b1c-9a0d-e5fc3169a676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:50:27 np0005546954 nova_compute[187160]: 2025-12-05 12:50:27.009 187164 DEBUG nova.network.neutron [-] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:50:27 np0005546954 nova_compute[187160]: 2025-12-05 12:50:27.025 187164 INFO nova.compute.manager [-] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Took 0.63 seconds to deallocate network for instance.#033[00m
Dec  5 07:50:27 np0005546954 nova_compute[187160]: 2025-12-05 12:50:27.097 187164 DEBUG oslo_concurrency.lockutils [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:27 np0005546954 nova_compute[187160]: 2025-12-05 12:50:27.097 187164 DEBUG oslo_concurrency.lockutils [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:27 np0005546954 nova_compute[187160]: 2025-12-05 12:50:27.102 187164 DEBUG oslo_concurrency.lockutils [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:27 np0005546954 nova_compute[187160]: 2025-12-05 12:50:27.123 187164 INFO nova.scheduler.client.report [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Deleted allocations for instance badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5#033[00m
Dec  5 07:50:27 np0005546954 nova_compute[187160]: 2025-12-05 12:50:27.188 187164 DEBUG oslo_concurrency.lockutils [None req-a1e770b6-67c9-4225-a423-ba00e5cd1846 40d645c136824137bea297268f8a9cee 7e0665b21e8d4fc092797e18e0320f99 - - default default] Lock "badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.738 187164 DEBUG nova.compute.manager [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Received event network-vif-unplugged-a3779d5b-46e5-441f-a30d-bb3473e5512d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.739 187164 DEBUG oslo_concurrency.lockutils [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.739 187164 DEBUG oslo_concurrency.lockutils [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.739 187164 DEBUG oslo_concurrency.lockutils [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.739 187164 DEBUG nova.compute.manager [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] No waiting events found dispatching network-vif-unplugged-a3779d5b-46e5-441f-a30d-bb3473e5512d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.740 187164 WARNING nova.compute.manager [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Received unexpected event network-vif-unplugged-a3779d5b-46e5-441f-a30d-bb3473e5512d for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.740 187164 DEBUG nova.compute.manager [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Received event network-vif-plugged-a3779d5b-46e5-441f-a30d-bb3473e5512d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.740 187164 DEBUG oslo_concurrency.lockutils [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.740 187164 DEBUG oslo_concurrency.lockutils [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.741 187164 DEBUG oslo_concurrency.lockutils [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.741 187164 DEBUG nova.compute.manager [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] No waiting events found dispatching network-vif-plugged-a3779d5b-46e5-441f-a30d-bb3473e5512d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.741 187164 WARNING nova.compute.manager [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Received unexpected event network-vif-plugged-a3779d5b-46e5-441f-a30d-bb3473e5512d for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:50:28 np0005546954 nova_compute[187160]: 2025-12-05 12:50:28.741 187164 DEBUG nova.compute.manager [req-d4101f0a-e8f3-4cbd-affe-33eb6b78fac7 req-fc60d602-4821-44b8-b51a-dc6d2f5407d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Received event network-vif-deleted-a3779d5b-46e5-441f-a30d-bb3473e5512d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:50:29 np0005546954 nova_compute[187160]: 2025-12-05 12:50:29.687 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:31 np0005546954 nova_compute[187160]: 2025-12-05 12:50:31.274 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:34 np0005546954 podman[211800]: 2025-12-05 12:50:34.566297752 +0000 UTC m=+0.064708653 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal)
Dec  5 07:50:34 np0005546954 podman[211801]: 2025-12-05 12:50:34.57095693 +0000 UTC m=+0.065359744 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Dec  5 07:50:34 np0005546954 nova_compute[187160]: 2025-12-05 12:50:34.690 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:35 np0005546954 podman[197513]: time="2025-12-05T12:50:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:50:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:50:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:50:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:50:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2588 "" "Go-http-client/1.1"
Dec  5 07:50:36 np0005546954 nova_compute[187160]: 2025-12-05 12:50:36.278 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:36 np0005546954 nova_compute[187160]: 2025-12-05 12:50:36.668 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939021.6660385, 47ffc1ff-837b-495c-a5ec-6a8b95d36137 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:50:36 np0005546954 nova_compute[187160]: 2025-12-05 12:50:36.669 187164 INFO nova.compute.manager [-] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:50:36 np0005546954 nova_compute[187160]: 2025-12-05 12:50:36.707 187164 DEBUG nova.compute.manager [None req-9168a25b-5255-4e89-bdd3-1da1772b8c6c - - - - - -] [instance: 47ffc1ff-837b-495c-a5ec-6a8b95d36137] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:50:39 np0005546954 nova_compute[187160]: 2025-12-05 12:50:39.692 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:41 np0005546954 nova_compute[187160]: 2025-12-05 12:50:41.254 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939026.252075, badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:50:41 np0005546954 nova_compute[187160]: 2025-12-05 12:50:41.254 187164 INFO nova.compute.manager [-] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:50:41 np0005546954 nova_compute[187160]: 2025-12-05 12:50:41.273 187164 DEBUG nova.compute.manager [None req-55ac3a7f-6385-41a0-b515-5d5e32fb4e7c - - - - - -] [instance: badd8ba9-d6c6-4c1a-bbe4-a0fef4a7a6a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:50:41 np0005546954 nova_compute[187160]: 2025-12-05 12:50:41.281 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:44 np0005546954 nova_compute[187160]: 2025-12-05 12:50:44.694 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:46 np0005546954 nova_compute[187160]: 2025-12-05 12:50:46.284 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:48 np0005546954 podman[211840]: 2025-12-05 12:50:48.561981818 +0000 UTC m=+0.059211220 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  5 07:50:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:50:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:50:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:50:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:50:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:50:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:50:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:50:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:50:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:50:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:50:49 np0005546954 nova_compute[187160]: 2025-12-05 12:50:49.695 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:51 np0005546954 nova_compute[187160]: 2025-12-05 12:50:51.287 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:52 np0005546954 podman[211860]: 2025-12-05 12:50:52.550020014 +0000 UTC m=+0.047422208 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:50:52 np0005546954 podman[211859]: 2025-12-05 12:50:52.577221212 +0000 UTC m=+0.074181262 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec  5 07:50:54 np0005546954 nova_compute[187160]: 2025-12-05 12:50:54.699 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:55 np0005546954 nova_compute[187160]: 2025-12-05 12:50:55.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:50:55 np0005546954 nova_compute[187160]: 2025-12-05 12:50:55.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:50:55 np0005546954 nova_compute[187160]: 2025-12-05 12:50:55.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:50:55 np0005546954 nova_compute[187160]: 2025-12-05 12:50:55.060 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:50:55 np0005546954 nova_compute[187160]: 2025-12-05 12:50:55.061 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:50:56 np0005546954 nova_compute[187160]: 2025-12-05 12:50:56.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:50:56 np0005546954 nova_compute[187160]: 2025-12-05 12:50:56.290 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:50:57 np0005546954 ovn_controller[95566]: 2025-12-05T12:50:57Z|00105|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Dec  5 07:50:59 np0005546954 nova_compute[187160]: 2025-12-05 12:50:59.036 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:50:59 np0005546954 nova_compute[187160]: 2025-12-05 12:50:59.051 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:50:59 np0005546954 nova_compute[187160]: 2025-12-05 12:50:59.702 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:01 np0005546954 nova_compute[187160]: 2025-12-05 12:51:01.293 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.122 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.123 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.123 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.123 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.354 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.357 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5870MB free_disk=73.33633804321289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.357 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.357 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.437 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.438 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.461 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.480 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.501 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.502 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:51:02 np0005546954 nova_compute[187160]: 2025-12-05 12:51:02.638 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:03 np0005546954 nova_compute[187160]: 2025-12-05 12:51:03.498 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:51:03 np0005546954 nova_compute[187160]: 2025-12-05 12:51:03.499 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:51:03 np0005546954 nova_compute[187160]: 2025-12-05 12:51:03.499 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:51:04 np0005546954 nova_compute[187160]: 2025-12-05 12:51:04.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:51:04 np0005546954 nova_compute[187160]: 2025-12-05 12:51:04.704 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:05 np0005546954 nova_compute[187160]: 2025-12-05 12:51:05.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:51:05 np0005546954 podman[211907]: 2025-12-05 12:51:05.55639344 +0000 UTC m=+0.059727808 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 07:51:05 np0005546954 podman[211906]: 2025-12-05 12:51:05.561597061 +0000 UTC m=+0.068600054 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, name=ubi9-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec  5 07:51:05 np0005546954 podman[197513]: time="2025-12-05T12:51:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:51:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:51:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:51:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:51:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2581 "" "Go-http-client/1.1"
Dec  5 07:51:06 np0005546954 nova_compute[187160]: 2025-12-05 12:51:06.297 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:09 np0005546954 nova_compute[187160]: 2025-12-05 12:51:09.706 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:11 np0005546954 nova_compute[187160]: 2025-12-05 12:51:11.301 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:51:12.822 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:51:12 np0005546954 nova_compute[187160]: 2025-12-05 12:51:12.823 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:51:12.824 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:51:14 np0005546954 nova_compute[187160]: 2025-12-05 12:51:14.709 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:15 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:51:15.826 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:51:16 np0005546954 nova_compute[187160]: 2025-12-05 12:51:16.304 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:51:16.948 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:51:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:51:16.949 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:51:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:51:16.949 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:51:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:51:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:51:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:51:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:51:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:51:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:51:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:51:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:51:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:51:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:51:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:51:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:51:19 np0005546954 podman[211945]: 2025-12-05 12:51:19.572067452 +0000 UTC m=+0.081427102 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  5 07:51:19 np0005546954 nova_compute[187160]: 2025-12-05 12:51:19.711 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:21 np0005546954 nova_compute[187160]: 2025-12-05 12:51:21.307 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:23 np0005546954 podman[211965]: 2025-12-05 12:51:23.589810296 +0000 UTC m=+0.077215461 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:51:23 np0005546954 podman[211964]: 2025-12-05 12:51:23.591142128 +0000 UTC m=+0.097777320 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  5 07:51:24 np0005546954 nova_compute[187160]: 2025-12-05 12:51:24.713 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:26 np0005546954 nova_compute[187160]: 2025-12-05 12:51:26.310 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:29 np0005546954 nova_compute[187160]: 2025-12-05 12:51:29.716 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:31 np0005546954 nova_compute[187160]: 2025-12-05 12:51:31.314 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:34 np0005546954 nova_compute[187160]: 2025-12-05 12:51:34.718 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:35 np0005546954 podman[197513]: time="2025-12-05T12:51:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:51:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:51:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:51:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:51:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2581 "" "Go-http-client/1.1"
Dec  5 07:51:36 np0005546954 nova_compute[187160]: 2025-12-05 12:51:36.318 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:36 np0005546954 podman[212014]: 2025-12-05 12:51:36.55978358 +0000 UTC m=+0.068731528 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, release=1755695350, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, architecture=x86_64, maintainer=Red Hat, Inc.)
Dec  5 07:51:36 np0005546954 podman[212015]: 2025-12-05 12:51:36.586220202 +0000 UTC m=+0.078765100 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 07:51:36 np0005546954 ovn_controller[95566]: 2025-12-05T12:51:36Z|00106|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec  5 07:51:39 np0005546954 nova_compute[187160]: 2025-12-05 12:51:39.720 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:41 np0005546954 nova_compute[187160]: 2025-12-05 12:51:41.321 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:44 np0005546954 nova_compute[187160]: 2025-12-05 12:51:44.722 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:46 np0005546954 nova_compute[187160]: 2025-12-05 12:51:46.324 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:51:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:51:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:51:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:51:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:51:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:51:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:51:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:51:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:51:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:51:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:51:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:51:49 np0005546954 nova_compute[187160]: 2025-12-05 12:51:49.723 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:50 np0005546954 podman[212057]: 2025-12-05 12:51:50.547489993 +0000 UTC m=+0.060337246 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:51:51 np0005546954 nova_compute[187160]: 2025-12-05 12:51:51.327 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:54 np0005546954 podman[212078]: 2025-12-05 12:51:54.544206584 +0000 UTC m=+0.056394595 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:51:54 np0005546954 podman[212077]: 2025-12-05 12:51:54.62030458 +0000 UTC m=+0.133638206 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:51:54 np0005546954 nova_compute[187160]: 2025-12-05 12:51:54.725 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:56 np0005546954 nova_compute[187160]: 2025-12-05 12:51:56.330 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:51:57 np0005546954 nova_compute[187160]: 2025-12-05 12:51:57.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:51:57 np0005546954 nova_compute[187160]: 2025-12-05 12:51:57.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:51:57 np0005546954 nova_compute[187160]: 2025-12-05 12:51:57.041 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:51:57 np0005546954 nova_compute[187160]: 2025-12-05 12:51:57.059 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:51:57 np0005546954 nova_compute[187160]: 2025-12-05 12:51:57.060 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:51:58 np0005546954 nova_compute[187160]: 2025-12-05 12:51:58.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:51:59 np0005546954 nova_compute[187160]: 2025-12-05 12:51:59.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:51:59 np0005546954 nova_compute[187160]: 2025-12-05 12:51:59.726 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:01 np0005546954 nova_compute[187160]: 2025-12-05 12:52:01.368 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.069 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.070 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.070 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.070 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.266 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.267 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5878MB free_disk=73.33637619018555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.268 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.268 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.367 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.367 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.404 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.426 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.428 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.428 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:52:04 np0005546954 nova_compute[187160]: 2025-12-05 12:52:04.778 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:05 np0005546954 nova_compute[187160]: 2025-12-05 12:52:05.423 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:52:05 np0005546954 podman[197513]: time="2025-12-05T12:52:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:52:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:52:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:52:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:52:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2578 "" "Go-http-client/1.1"
Dec  5 07:52:06 np0005546954 nova_compute[187160]: 2025-12-05 12:52:06.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:52:06 np0005546954 nova_compute[187160]: 2025-12-05 12:52:06.371 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:07 np0005546954 nova_compute[187160]: 2025-12-05 12:52:07.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:52:07 np0005546954 podman[212128]: 2025-12-05 12:52:07.579674924 +0000 UTC m=+0.085846889 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:52:07 np0005546954 podman[212127]: 2025-12-05 12:52:07.58823381 +0000 UTC m=+0.091946309 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 07:52:09 np0005546954 nova_compute[187160]: 2025-12-05 12:52:09.780 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:11 np0005546954 nova_compute[187160]: 2025-12-05 12:52:11.374 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:11 np0005546954 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  5 07:52:14 np0005546954 nova_compute[187160]: 2025-12-05 12:52:14.819 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:14 np0005546954 nova_compute[187160]: 2025-12-05 12:52:14.902 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "d8c06fde-c17f-425f-a419-71aa3687ce9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:52:14 np0005546954 nova_compute[187160]: 2025-12-05 12:52:14.903 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:52:14 np0005546954 nova_compute[187160]: 2025-12-05 12:52:14.920 187164 DEBUG nova.compute.manager [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:52:14 np0005546954 nova_compute[187160]: 2025-12-05 12:52:14.987 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:52:14 np0005546954 nova_compute[187160]: 2025-12-05 12:52:14.987 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:52:14 np0005546954 nova_compute[187160]: 2025-12-05 12:52:14.995 187164 DEBUG nova.virt.hardware [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:52:14 np0005546954 nova_compute[187160]: 2025-12-05 12:52:14.996 187164 INFO nova.compute.claims [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.098 187164 DEBUG nova.compute.provider_tree [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.117 187164 DEBUG nova.scheduler.client.report [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.148 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.149 187164 DEBUG nova.compute.manager [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.196 187164 DEBUG nova.compute.manager [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.197 187164 DEBUG nova.network.neutron [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.218 187164 INFO nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.234 187164 DEBUG nova.compute.manager [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.331 187164 DEBUG nova.compute.manager [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.333 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.333 187164 INFO nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Creating image(s)#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.334 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "/var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.334 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.335 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.346 187164 DEBUG oslo_concurrency.processutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.378 187164 DEBUG nova.policy [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ae0bb20ac8b4be99eb1abddc7310436', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.403 187164 DEBUG oslo_concurrency.processutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.404 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.405 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.420 187164 DEBUG oslo_concurrency.processutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.484 187164 DEBUG oslo_concurrency.processutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.486 187164 DEBUG oslo_concurrency.processutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.528 187164 DEBUG oslo_concurrency.processutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.529 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.530 187164 DEBUG oslo_concurrency.processutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.599 187164 DEBUG oslo_concurrency.processutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.600 187164 DEBUG nova.virt.disk.api [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Checking if we can resize image /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.601 187164 DEBUG oslo_concurrency.processutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.668 187164 DEBUG oslo_concurrency.processutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.669 187164 DEBUG nova.virt.disk.api [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Cannot resize image /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.670 187164 DEBUG nova.objects.instance [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'migration_context' on Instance uuid d8c06fde-c17f-425f-a419-71aa3687ce9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.685 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.686 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Ensure instance console log exists: /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.686 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.687 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.687 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:52:15 np0005546954 nova_compute[187160]: 2025-12-05 12:52:15.922 187164 DEBUG nova.network.neutron [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Successfully created port: d430a736-69d8-44c8-a904-c0dd5d569dd7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:52:16 np0005546954 nova_compute[187160]: 2025-12-05 12:52:16.377 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:16 np0005546954 nova_compute[187160]: 2025-12-05 12:52:16.677 187164 DEBUG nova.network.neutron [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Successfully updated port: d430a736-69d8-44c8-a904-c0dd5d569dd7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:52:16 np0005546954 nova_compute[187160]: 2025-12-05 12:52:16.701 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "refresh_cache-d8c06fde-c17f-425f-a419-71aa3687ce9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:52:16 np0005546954 nova_compute[187160]: 2025-12-05 12:52:16.702 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquired lock "refresh_cache-d8c06fde-c17f-425f-a419-71aa3687ce9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:52:16 np0005546954 nova_compute[187160]: 2025-12-05 12:52:16.702 187164 DEBUG nova.network.neutron [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:52:16 np0005546954 nova_compute[187160]: 2025-12-05 12:52:16.802 187164 DEBUG nova.compute.manager [req-c9aeb5a5-e806-4510-b6f6-4598a4df469f req-7f7a2bd5-399e-4113-984c-2ede8ee0dd3a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Received event network-changed-d430a736-69d8-44c8-a904-c0dd5d569dd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:52:16 np0005546954 nova_compute[187160]: 2025-12-05 12:52:16.803 187164 DEBUG nova.compute.manager [req-c9aeb5a5-e806-4510-b6f6-4598a4df469f req-7f7a2bd5-399e-4113-984c-2ede8ee0dd3a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Refreshing instance network info cache due to event network-changed-d430a736-69d8-44c8-a904-c0dd5d569dd7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:52:16 np0005546954 nova_compute[187160]: 2025-12-05 12:52:16.803 187164 DEBUG oslo_concurrency.lockutils [req-c9aeb5a5-e806-4510-b6f6-4598a4df469f req-7f7a2bd5-399e-4113-984c-2ede8ee0dd3a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-d8c06fde-c17f-425f-a419-71aa3687ce9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:52:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:16.951 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:52:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:16.953 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:52:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:16.953 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:52:17 np0005546954 nova_compute[187160]: 2025-12-05 12:52:17.502 187164 DEBUG nova.network.neutron [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.807 187164 DEBUG nova.network.neutron [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Updating instance_info_cache with network_info: [{"id": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "address": "fa:16:3e:57:12:ee", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd430a736-69", "ovs_interfaceid": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.830 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Releasing lock "refresh_cache-d8c06fde-c17f-425f-a419-71aa3687ce9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.831 187164 DEBUG nova.compute.manager [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Instance network_info: |[{"id": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "address": "fa:16:3e:57:12:ee", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd430a736-69", "ovs_interfaceid": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.832 187164 DEBUG oslo_concurrency.lockutils [req-c9aeb5a5-e806-4510-b6f6-4598a4df469f req-7f7a2bd5-399e-4113-984c-2ede8ee0dd3a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-d8c06fde-c17f-425f-a419-71aa3687ce9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.832 187164 DEBUG nova.network.neutron [req-c9aeb5a5-e806-4510-b6f6-4598a4df469f req-7f7a2bd5-399e-4113-984c-2ede8ee0dd3a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Refreshing network info cache for port d430a736-69d8-44c8-a904-c0dd5d569dd7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.838 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Start _get_guest_xml network_info=[{"id": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "address": "fa:16:3e:57:12:ee", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd430a736-69", "ovs_interfaceid": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.845 187164 WARNING nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.851 187164 DEBUG nova.virt.libvirt.host [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.852 187164 DEBUG nova.virt.libvirt.host [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.857 187164 DEBUG nova.virt.libvirt.host [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.858 187164 DEBUG nova.virt.libvirt.host [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.860 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.861 187164 DEBUG nova.virt.hardware [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.862 187164 DEBUG nova.virt.hardware [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.862 187164 DEBUG nova.virt.hardware [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.863 187164 DEBUG nova.virt.hardware [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.863 187164 DEBUG nova.virt.hardware [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.864 187164 DEBUG nova.virt.hardware [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.865 187164 DEBUG nova.virt.hardware [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.865 187164 DEBUG nova.virt.hardware [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.866 187164 DEBUG nova.virt.hardware [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.866 187164 DEBUG nova.virt.hardware [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.866 187164 DEBUG nova.virt.hardware [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.875 187164 DEBUG nova.virt.libvirt.vif [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:52:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-452631592',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-452631592',id=11,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-5q5hgc3u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:52:15Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=d8c06fde-c17f-425f-a419-71aa3687ce9d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "address": "fa:16:3e:57:12:ee", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd430a736-69", "ovs_interfaceid": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.875 187164 DEBUG nova.network.os_vif_util [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "address": "fa:16:3e:57:12:ee", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd430a736-69", "ovs_interfaceid": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.877 187164 DEBUG nova.network.os_vif_util [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:12:ee,bridge_name='br-int',has_traffic_filtering=True,id=d430a736-69d8-44c8-a904-c0dd5d569dd7,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd430a736-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.879 187164 DEBUG nova.objects.instance [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'pci_devices' on Instance uuid d8c06fde-c17f-425f-a419-71aa3687ce9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.896 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  <uuid>d8c06fde-c17f-425f-a419-71aa3687ce9d</uuid>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  <name>instance-0000000b</name>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteStrategies-server-452631592</nova:name>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 12:52:18</nova:creationTime>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:        <nova:user uuid="0ae0bb20ac8b4be99eb1abddc7310436">tempest-TestExecuteStrategies-192029678-project-member</nova:user>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:        <nova:project uuid="e6ae0d0dcde04b85b6dae45560cca988">tempest-TestExecuteStrategies-192029678</nova:project>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:        <nova:port uuid="d430a736-69d8-44c8-a904-c0dd5d569dd7">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <system>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <entry name="serial">d8c06fde-c17f-425f-a419-71aa3687ce9d</entry>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <entry name="uuid">d8c06fde-c17f-425f-a419-71aa3687ce9d</entry>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    </system>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  <os>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  </clock>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk.config"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:57:12:ee"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <target dev="tapd430a736-69"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/console.log" append="off"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    </serial>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <video>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 07:52:18 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 07:52:18 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:52:18 np0005546954 nova_compute[187160]: </domain>
Dec  5 07:52:18 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.898 187164 DEBUG nova.compute.manager [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Preparing to wait for external event network-vif-plugged-d430a736-69d8-44c8-a904-c0dd5d569dd7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.899 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.899 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.899 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.900 187164 DEBUG nova.virt.libvirt.vif [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:52:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-452631592',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-452631592',id=11,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-5q5hgc3u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:52:15Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=d8c06fde-c17f-425f-a419-71aa3687ce9d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "address": "fa:16:3e:57:12:ee", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd430a736-69", "ovs_interfaceid": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.900 187164 DEBUG nova.network.os_vif_util [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "address": "fa:16:3e:57:12:ee", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd430a736-69", "ovs_interfaceid": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.901 187164 DEBUG nova.network.os_vif_util [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:12:ee,bridge_name='br-int',has_traffic_filtering=True,id=d430a736-69d8-44c8-a904-c0dd5d569dd7,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd430a736-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.902 187164 DEBUG os_vif [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:12:ee,bridge_name='br-int',has_traffic_filtering=True,id=d430a736-69d8-44c8-a904-c0dd5d569dd7,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd430a736-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.902 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.903 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.904 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.908 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.908 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd430a736-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.909 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd430a736-69, col_values=(('external_ids', {'iface-id': 'd430a736-69d8-44c8-a904-c0dd5d569dd7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:12:ee', 'vm-uuid': 'd8c06fde-c17f-425f-a419-71aa3687ce9d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.911 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:18 np0005546954 NetworkManager[55665]: <info>  [1764939138.9132] manager: (tapd430a736-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.914 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.922 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.924 187164 INFO os_vif [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:12:ee,bridge_name='br-int',has_traffic_filtering=True,id=d430a736-69d8-44c8-a904-c0dd5d569dd7,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd430a736-69')#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.981 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.981 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.982 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No VIF found with MAC fa:16:3e:57:12:ee, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:52:18 np0005546954 nova_compute[187160]: 2025-12-05 12:52:18.982 187164 INFO nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Using config drive#033[00m
Dec  5 07:52:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:52:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:52:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:52:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:52:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:52:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:52:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:52:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:52:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:52:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:52:19 np0005546954 nova_compute[187160]: 2025-12-05 12:52:19.714 187164 INFO nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Creating config drive at /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk.config#033[00m
Dec  5 07:52:19 np0005546954 nova_compute[187160]: 2025-12-05 12:52:19.721 187164 DEBUG oslo_concurrency.processutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz1a87yzh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:52:19 np0005546954 nova_compute[187160]: 2025-12-05 12:52:19.822 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:19 np0005546954 nova_compute[187160]: 2025-12-05 12:52:19.852 187164 DEBUG oslo_concurrency.processutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz1a87yzh" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:52:19 np0005546954 kernel: tapd430a736-69: entered promiscuous mode
Dec  5 07:52:19 np0005546954 NetworkManager[55665]: <info>  [1764939139.9255] manager: (tapd430a736-69): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Dec  5 07:52:19 np0005546954 nova_compute[187160]: 2025-12-05 12:52:19.927 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:52:19Z|00107|binding|INFO|Claiming lport d430a736-69d8-44c8-a904-c0dd5d569dd7 for this chassis.
Dec  5 07:52:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:52:19Z|00108|binding|INFO|d430a736-69d8-44c8-a904-c0dd5d569dd7: Claiming fa:16:3e:57:12:ee 10.100.0.7
Dec  5 07:52:19 np0005546954 nova_compute[187160]: 2025-12-05 12:52:19.935 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:19.948 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:12:ee 10.100.0.7'], port_security=['fa:16:3e:57:12:ee 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'd8c06fde-c17f-425f-a419-71aa3687ce9d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=d430a736-69d8-44c8-a904-c0dd5d569dd7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:52:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:19.951 104428 INFO neutron.agent.ovn.metadata.agent [-] Port d430a736-69d8-44c8-a904-c0dd5d569dd7 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis#033[00m
Dec  5 07:52:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:19.954 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:52:19 np0005546954 systemd-udevd[212199]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:52:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:19.967 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5138ff-42cb-4bdb-86ec-ad2889004fea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:19.969 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd4389bc8-21 in ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:52:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:19.971 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd4389bc8-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:52:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:19.972 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[cb118344-7e6f-402c-970e-cd937afdc2a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:19.973 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[bd3d8ca0-5c78-45cd-94d8-7453f3fe6d7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:19 np0005546954 NetworkManager[55665]: <info>  [1764939139.9760] device (tapd430a736-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:52:19 np0005546954 NetworkManager[55665]: <info>  [1764939139.9769] device (tapd430a736-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:52:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:19.991 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[e8ceb4e0-9d50-452d-907e-6a0b6f181872]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:19 np0005546954 nova_compute[187160]: 2025-12-05 12:52:19.999 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:20 np0005546954 systemd-machined[153497]: New machine qemu-10-instance-0000000b.
Dec  5 07:52:20 np0005546954 ovn_controller[95566]: 2025-12-05T12:52:20Z|00109|binding|INFO|Setting lport d430a736-69d8-44c8-a904-c0dd5d569dd7 ovn-installed in OVS
Dec  5 07:52:20 np0005546954 ovn_controller[95566]: 2025-12-05T12:52:20Z|00110|binding|INFO|Setting lport d430a736-69d8-44c8-a904-c0dd5d569dd7 up in Southbound
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.007 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.019 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[1a19e3e6-1455-460a-bd47-ade50ca6729a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:20 np0005546954 systemd[1]: Started Virtual Machine qemu-10-instance-0000000b.
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.055 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[8f8974c4-bc60-462b-a0f4-538beae9b70e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.063 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7b5d687d-9689-4597-b387-06394771e5e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:20 np0005546954 NetworkManager[55665]: <info>  [1764939140.0640] manager: (tapd4389bc8-20): new Veth device (/org/freedesktop/NetworkManager/Devices/48)
Dec  5 07:52:20 np0005546954 systemd-udevd[212204]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.099 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[7a58608e-359a-4e0c-881e-53208ce35c82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.103 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[d946fe72-5311-4b64-9212-c0602c63b6d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:20 np0005546954 NetworkManager[55665]: <info>  [1764939140.1266] device (tapd4389bc8-20): carrier: link connected
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.133 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[92b6525c-853b-47f3-b00e-df84ad34d1de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.151 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[22d1bbfd-245d-4553-96c4-c3922237f4f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414675, 'reachable_time': 31640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212235, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.169 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[9a2e6201-c9c5-47f3-bdf5-dee0a80ae562]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:43f7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414675, 'tstamp': 414675}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212236, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.187 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[9b311dfe-c855-4b59-9701-3da92bdad8b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414675, 'reachable_time': 31640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212237, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.226 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d2334919-4c6d-44e3-9bff-c74c83e0af50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.291 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e79d70b0-c0b6-4828-b77a-d9e7f10cab5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.293 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.294 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.295 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.297 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:20 np0005546954 NetworkManager[55665]: <info>  [1764939140.2982] manager: (tapd4389bc8-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Dec  5 07:52:20 np0005546954 kernel: tapd4389bc8-20: entered promiscuous mode
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.300 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.302 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.304 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:20 np0005546954 ovn_controller[95566]: 2025-12-05T12:52:20Z|00111|binding|INFO|Releasing lport 8dbe2af5-9f18-44ca-8f22-66854bcdd596 from this chassis (sb_readonly=0)
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.329 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.332 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.333 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[a9de1fb0-0c05-4d2c-857c-f53561ed38a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.334 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.336 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'env', 'PROCESS_TAG=haproxy-d4389bc8-2898-48b0-9741-5183b54fe83c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d4389bc8-2898-48b0-9741-5183b54fe83c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.674 187164 DEBUG nova.compute.manager [req-024031ce-42d4-4273-80b0-8a761df87733 req-1cd7fdd5-2af6-44bc-881e-732019e251c8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Received event network-vif-plugged-d430a736-69d8-44c8-a904-c0dd5d569dd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.675 187164 DEBUG oslo_concurrency.lockutils [req-024031ce-42d4-4273-80b0-8a761df87733 req-1cd7fdd5-2af6-44bc-881e-732019e251c8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.675 187164 DEBUG oslo_concurrency.lockutils [req-024031ce-42d4-4273-80b0-8a761df87733 req-1cd7fdd5-2af6-44bc-881e-732019e251c8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.675 187164 DEBUG oslo_concurrency.lockutils [req-024031ce-42d4-4273-80b0-8a761df87733 req-1cd7fdd5-2af6-44bc-881e-732019e251c8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.676 187164 DEBUG nova.compute.manager [req-024031ce-42d4-4273-80b0-8a761df87733 req-1cd7fdd5-2af6-44bc-881e-732019e251c8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Processing event network-vif-plugged-d430a736-69d8-44c8-a904-c0dd5d569dd7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:52:20 np0005546954 podman[212270]: 2025-12-05 12:52:20.760478534 +0000 UTC m=+0.072285579 container create 1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.769 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.773 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:20 np0005546954 podman[212270]: 2025-12-05 12:52:20.716944011 +0000 UTC m=+0.028751066 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:52:20 np0005546954 systemd[1]: Started libpod-conmon-1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b.scope.
Dec  5 07:52:20 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:52:20 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad9161cf92b7f699eab1b887fa35832cea80dd03eae71f040429509a05f1a38f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.877 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939140.876251, d8c06fde-c17f-425f-a419-71aa3687ce9d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:52:20 np0005546954 podman[212270]: 2025-12-05 12:52:20.8783932 +0000 UTC m=+0.190200265 container init 1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.878 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] VM Started (Lifecycle Event)#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.883 187164 DEBUG nova.compute.manager [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:52:20 np0005546954 podman[212270]: 2025-12-05 12:52:20.886380438 +0000 UTC m=+0.198187473 container start 1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.888 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:52:20 np0005546954 podman[212284]: 2025-12-05 12:52:20.89224787 +0000 UTC m=+0.082951679 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.893 187164 INFO nova.virt.libvirt.driver [-] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Instance spawned successfully.#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.894 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.901 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.905 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.913 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.913 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.913 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.914 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.914 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.915 187164 DEBUG nova.virt.libvirt.driver [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:52:20 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[212298]: [NOTICE]   (212312) : New worker (212314) forked
Dec  5 07:52:20 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[212298]: [NOTICE]   (212312) : Loading success.
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.922 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.922 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939140.8781438, d8c06fde-c17f-425f-a419-71aa3687ce9d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.922 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.943 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.950 187164 DEBUG nova.network.neutron [req-c9aeb5a5-e806-4510-b6f6-4598a4df469f req-7f7a2bd5-399e-4113-984c-2ede8ee0dd3a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Updated VIF entry in instance network info cache for port d430a736-69d8-44c8-a904-c0dd5d569dd7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.951 187164 DEBUG nova.network.neutron [req-c9aeb5a5-e806-4510-b6f6-4598a4df469f req-7f7a2bd5-399e-4113-984c-2ede8ee0dd3a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Updating instance_info_cache with network_info: [{"id": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "address": "fa:16:3e:57:12:ee", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd430a736-69", "ovs_interfaceid": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.955 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939140.8873453, d8c06fde-c17f-425f-a419-71aa3687ce9d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.955 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:52:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:20.964 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.972 187164 DEBUG oslo_concurrency.lockutils [req-c9aeb5a5-e806-4510-b6f6-4598a4df469f req-7f7a2bd5-399e-4113-984c-2ede8ee0dd3a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-d8c06fde-c17f-425f-a419-71aa3687ce9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.978 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.984 187164 INFO nova.compute.manager [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Took 5.65 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.985 187164 DEBUG nova.compute.manager [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:52:20 np0005546954 nova_compute[187160]: 2025-12-05 12:52:20.988 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:52:21 np0005546954 nova_compute[187160]: 2025-12-05 12:52:21.017 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:52:21 np0005546954 nova_compute[187160]: 2025-12-05 12:52:21.055 187164 INFO nova.compute.manager [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Took 6.10 seconds to build instance.#033[00m
Dec  5 07:52:21 np0005546954 nova_compute[187160]: 2025-12-05 12:52:21.073 187164 DEBUG oslo_concurrency.lockutils [None req-d60b0b21-be68-4e9b-85a4-e4fd2c52f160 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:52:22 np0005546954 nova_compute[187160]: 2025-12-05 12:52:22.782 187164 DEBUG nova.compute.manager [req-f40c9677-0e11-4b8d-86c8-07bdccf18f7d req-353484a9-f2b2-44c5-907a-0bfd7e5852fa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Received event network-vif-plugged-d430a736-69d8-44c8-a904-c0dd5d569dd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:52:22 np0005546954 nova_compute[187160]: 2025-12-05 12:52:22.782 187164 DEBUG oslo_concurrency.lockutils [req-f40c9677-0e11-4b8d-86c8-07bdccf18f7d req-353484a9-f2b2-44c5-907a-0bfd7e5852fa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:52:22 np0005546954 nova_compute[187160]: 2025-12-05 12:52:22.782 187164 DEBUG oslo_concurrency.lockutils [req-f40c9677-0e11-4b8d-86c8-07bdccf18f7d req-353484a9-f2b2-44c5-907a-0bfd7e5852fa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:52:22 np0005546954 nova_compute[187160]: 2025-12-05 12:52:22.783 187164 DEBUG oslo_concurrency.lockutils [req-f40c9677-0e11-4b8d-86c8-07bdccf18f7d req-353484a9-f2b2-44c5-907a-0bfd7e5852fa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:52:22 np0005546954 nova_compute[187160]: 2025-12-05 12:52:22.783 187164 DEBUG nova.compute.manager [req-f40c9677-0e11-4b8d-86c8-07bdccf18f7d req-353484a9-f2b2-44c5-907a-0bfd7e5852fa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] No waiting events found dispatching network-vif-plugged-d430a736-69d8-44c8-a904-c0dd5d569dd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:52:22 np0005546954 nova_compute[187160]: 2025-12-05 12:52:22.783 187164 WARNING nova.compute.manager [req-f40c9677-0e11-4b8d-86c8-07bdccf18f7d req-353484a9-f2b2-44c5-907a-0bfd7e5852fa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Received unexpected event network-vif-plugged-d430a736-69d8-44c8-a904-c0dd5d569dd7 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:52:23 np0005546954 nova_compute[187160]: 2025-12-05 12:52:23.913 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:24 np0005546954 nova_compute[187160]: 2025-12-05 12:52:24.869 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:25 np0005546954 podman[212326]: 2025-12-05 12:52:25.543819739 +0000 UTC m=+0.056653602 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:52:25 np0005546954 podman[212325]: 2025-12-05 12:52:25.57793881 +0000 UTC m=+0.091908258 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:52:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:52:26.967 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:52:28 np0005546954 nova_compute[187160]: 2025-12-05 12:52:28.916 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:29 np0005546954 nova_compute[187160]: 2025-12-05 12:52:29.908 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:33 np0005546954 nova_compute[187160]: 2025-12-05 12:52:33.920 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:34 np0005546954 ovn_controller[95566]: 2025-12-05T12:52:34Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:57:12:ee 10.100.0.7
Dec  5 07:52:34 np0005546954 ovn_controller[95566]: 2025-12-05T12:52:34Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:57:12:ee 10.100.0.7
Dec  5 07:52:34 np0005546954 nova_compute[187160]: 2025-12-05 12:52:34.973 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:35 np0005546954 podman[197513]: time="2025-12-05T12:52:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:52:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:52:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 07:52:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:52:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3035 "" "Go-http-client/1.1"
Dec  5 07:52:38 np0005546954 podman[212390]: 2025-12-05 12:52:38.579826235 +0000 UTC m=+0.087306745 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec  5 07:52:38 np0005546954 podman[212391]: 2025-12-05 12:52:38.604705388 +0000 UTC m=+0.101664321 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3)
Dec  5 07:52:38 np0005546954 nova_compute[187160]: 2025-12-05 12:52:38.923 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:40 np0005546954 nova_compute[187160]: 2025-12-05 12:52:40.020 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:43 np0005546954 nova_compute[187160]: 2025-12-05 12:52:43.925 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:45 np0005546954 nova_compute[187160]: 2025-12-05 12:52:45.023 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:48 np0005546954 nova_compute[187160]: 2025-12-05 12:52:48.973 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:52:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:52:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:52:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:52:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:52:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:52:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:52:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:52:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:52:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:52:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:52:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:52:50 np0005546954 nova_compute[187160]: 2025-12-05 12:52:50.090 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:51 np0005546954 podman[212430]: 2025-12-05 12:52:51.54536052 +0000 UTC m=+0.057107717 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:52:53 np0005546954 nova_compute[187160]: 2025-12-05 12:52:53.977 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:55 np0005546954 nova_compute[187160]: 2025-12-05 12:52:55.126 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:56 np0005546954 podman[212451]: 2025-12-05 12:52:56.559431599 +0000 UTC m=+0.065445576 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:52:56 np0005546954 podman[212450]: 2025-12-05 12:52:56.596237123 +0000 UTC m=+0.104925983 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:52:57 np0005546954 nova_compute[187160]: 2025-12-05 12:52:57.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:52:57 np0005546954 nova_compute[187160]: 2025-12-05 12:52:57.041 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:52:57 np0005546954 nova_compute[187160]: 2025-12-05 12:52:57.041 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:52:57 np0005546954 nova_compute[187160]: 2025-12-05 12:52:57.530 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-d8c06fde-c17f-425f-a419-71aa3687ce9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:52:57 np0005546954 nova_compute[187160]: 2025-12-05 12:52:57.531 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-d8c06fde-c17f-425f-a419-71aa3687ce9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:52:57 np0005546954 nova_compute[187160]: 2025-12-05 12:52:57.531 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:52:57 np0005546954 nova_compute[187160]: 2025-12-05 12:52:57.531 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d8c06fde-c17f-425f-a419-71aa3687ce9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:52:58 np0005546954 nova_compute[187160]: 2025-12-05 12:52:58.980 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:52:59 np0005546954 nova_compute[187160]: 2025-12-05 12:52:59.183 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Updating instance_info_cache with network_info: [{"id": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "address": "fa:16:3e:57:12:ee", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd430a736-69", "ovs_interfaceid": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:52:59 np0005546954 nova_compute[187160]: 2025-12-05 12:52:59.246 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-d8c06fde-c17f-425f-a419-71aa3687ce9d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:52:59 np0005546954 nova_compute[187160]: 2025-12-05 12:52:59.247 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:52:59 np0005546954 nova_compute[187160]: 2025-12-05 12:52:59.247 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:52:59 np0005546954 nova_compute[187160]: 2025-12-05 12:52:59.248 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:52:59 np0005546954 nova_compute[187160]: 2025-12-05 12:52:59.248 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:53:00 np0005546954 nova_compute[187160]: 2025-12-05 12:53:00.129 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:03 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:03Z|00112|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Dec  5 07:53:04 np0005546954 nova_compute[187160]: 2025-12-05 12:53:04.016 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:05 np0005546954 nova_compute[187160]: 2025-12-05 12:53:05.185 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:05 np0005546954 podman[197513]: time="2025-12-05T12:53:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:53:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:53:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 07:53:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:53:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3057 "" "Go-http-client/1.1"
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.058 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.059 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.059 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.079 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.079 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.079 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.079 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.146 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.244 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.245 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.309 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.485 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.487 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5713MB free_disk=73.30756378173828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.487 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.487 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.565 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance d8c06fde-c17f-425f-a419-71aa3687ce9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.565 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.566 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.621 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.635 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.663 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:53:06 np0005546954 nova_compute[187160]: 2025-12-05 12:53:06.664 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:07 np0005546954 nova_compute[187160]: 2025-12-05 12:53:07.645 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:53:08 np0005546954 nova_compute[187160]: 2025-12-05 12:53:08.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:53:09 np0005546954 nova_compute[187160]: 2025-12-05 12:53:09.050 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:09 np0005546954 nova_compute[187160]: 2025-12-05 12:53:09.130 187164 DEBUG nova.virt.libvirt.driver [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Creating tmpfile /var/lib/nova/instances/tmpny8sfs9c to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  5 07:53:09 np0005546954 nova_compute[187160]: 2025-12-05 12:53:09.132 187164 DEBUG nova.compute.manager [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpny8sfs9c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  5 07:53:09 np0005546954 podman[212506]: 2025-12-05 12:53:09.561312433 +0000 UTC m=+0.065948280 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal)
Dec  5 07:53:09 np0005546954 podman[212507]: 2025-12-05 12:53:09.58242113 +0000 UTC m=+0.088432831 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:53:10 np0005546954 nova_compute[187160]: 2025-12-05 12:53:10.214 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:10 np0005546954 nova_compute[187160]: 2025-12-05 12:53:10.331 187164 DEBUG nova.compute.manager [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpny8sfs9c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7b88fad7-6e01-4547-a1cc-39dff2d6f7e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  5 07:53:10 np0005546954 nova_compute[187160]: 2025-12-05 12:53:10.367 187164 DEBUG oslo_concurrency.lockutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-7b88fad7-6e01-4547-a1cc-39dff2d6f7e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:53:10 np0005546954 nova_compute[187160]: 2025-12-05 12:53:10.367 187164 DEBUG oslo_concurrency.lockutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-7b88fad7-6e01-4547-a1cc-39dff2d6f7e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:53:10 np0005546954 nova_compute[187160]: 2025-12-05 12:53:10.368 187164 DEBUG nova.network.neutron [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.663 187164 DEBUG nova.network.neutron [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Updating instance_info_cache with network_info: [{"id": "a0d238fe-3933-4631-8173-efaff134884c", "address": "fa:16:3e:dd:62:aa", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d238fe-39", "ovs_interfaceid": "a0d238fe-3933-4631-8173-efaff134884c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.690 187164 DEBUG oslo_concurrency.lockutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-7b88fad7-6e01-4547-a1cc-39dff2d6f7e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.693 187164 DEBUG nova.virt.libvirt.driver [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpny8sfs9c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7b88fad7-6e01-4547-a1cc-39dff2d6f7e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.694 187164 DEBUG nova.virt.libvirt.driver [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Creating instance directory: /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.694 187164 DEBUG nova.virt.libvirt.driver [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Creating disk.info with the contents: {'/var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk': 'qcow2', '/var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.695 187164 DEBUG nova.virt.libvirt.driver [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.695 187164 DEBUG nova.objects.instance [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.726 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.805 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.806 187164 DEBUG oslo_concurrency.lockutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.807 187164 DEBUG oslo_concurrency.lockutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.817 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.877 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.878 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.932 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.933 187164 DEBUG oslo_concurrency.lockutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:11 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.934 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:11.999 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.001 187164 DEBUG nova.virt.disk.api [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Checking if we can resize image /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.001 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.059 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.060 187164 DEBUG nova.virt.disk.api [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Cannot resize image /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.060 187164 DEBUG nova.objects.instance [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'migration_context' on Instance uuid 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.076 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.101 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk.config 485376" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.103 187164 DEBUG nova.virt.libvirt.volume.remotefs [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk.config to /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.103 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk.config /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.623 187164 DEBUG oslo_concurrency.processutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0/disk.config /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.624 187164 DEBUG nova.virt.libvirt.driver [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.626 187164 DEBUG nova.virt.libvirt.vif [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:52:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-795851383',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-795851383',id=12,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:52:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-xt4ngszi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:52:39Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=7b88fad7-6e01-4547-a1cc-39dff2d6f7e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a0d238fe-3933-4631-8173-efaff134884c", "address": "fa:16:3e:dd:62:aa", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa0d238fe-39", "ovs_interfaceid": "a0d238fe-3933-4631-8173-efaff134884c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.626 187164 DEBUG nova.network.os_vif_util [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "a0d238fe-3933-4631-8173-efaff134884c", "address": "fa:16:3e:dd:62:aa", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa0d238fe-39", "ovs_interfaceid": "a0d238fe-3933-4631-8173-efaff134884c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.627 187164 DEBUG nova.network.os_vif_util [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:62:aa,bridge_name='br-int',has_traffic_filtering=True,id=a0d238fe-3933-4631-8173-efaff134884c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d238fe-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.627 187164 DEBUG os_vif [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:62:aa,bridge_name='br-int',has_traffic_filtering=True,id=a0d238fe-3933-4631-8173-efaff134884c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d238fe-39') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.628 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.628 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.629 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.632 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.633 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0d238fe-39, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.633 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa0d238fe-39, col_values=(('external_ids', {'iface-id': 'a0d238fe-3933-4631-8173-efaff134884c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:62:aa', 'vm-uuid': '7b88fad7-6e01-4547-a1cc-39dff2d6f7e0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.635 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:12 np0005546954 NetworkManager[55665]: <info>  [1764939192.6363] manager: (tapa0d238fe-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.639 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.642 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.643 187164 INFO os_vif [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:62:aa,bridge_name='br-int',has_traffic_filtering=True,id=a0d238fe-3933-4631-8173-efaff134884c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d238fe-39')#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.644 187164 DEBUG nova.virt.libvirt.driver [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  5 07:53:12 np0005546954 nova_compute[187160]: 2025-12-05 12:53:12.644 187164 DEBUG nova.compute.manager [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpny8sfs9c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7b88fad7-6e01-4547-a1cc-39dff2d6f7e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  5 07:53:15 np0005546954 nova_compute[187160]: 2025-12-05 12:53:15.218 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:16 np0005546954 nova_compute[187160]: 2025-12-05 12:53:16.787 187164 DEBUG nova.network.neutron [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Port a0d238fe-3933-4631-8173-efaff134884c updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  5 07:53:16 np0005546954 nova_compute[187160]: 2025-12-05 12:53:16.789 187164 DEBUG nova.compute.manager [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpny8sfs9c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7b88fad7-6e01-4547-a1cc-39dff2d6f7e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  5 07:53:16 np0005546954 systemd[1]: Starting libvirt proxy daemon...
Dec  5 07:53:16 np0005546954 systemd[1]: Started libvirt proxy daemon.
Dec  5 07:53:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:16.951 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:16.953 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:16.954 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:17 np0005546954 NetworkManager[55665]: <info>  [1764939197.1080] manager: (tapa0d238fe-39): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Dec  5 07:53:17 np0005546954 kernel: tapa0d238fe-39: entered promiscuous mode
Dec  5 07:53:17 np0005546954 nova_compute[187160]: 2025-12-05 12:53:17.110 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:17 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:17Z|00113|binding|INFO|Claiming lport a0d238fe-3933-4631-8173-efaff134884c for this additional chassis.
Dec  5 07:53:17 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:17Z|00114|binding|INFO|a0d238fe-3933-4631-8173-efaff134884c: Claiming fa:16:3e:dd:62:aa 10.100.0.8
Dec  5 07:53:17 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:17Z|00115|binding|INFO|Setting lport a0d238fe-3933-4631-8173-efaff134884c ovn-installed in OVS
Dec  5 07:53:17 np0005546954 nova_compute[187160]: 2025-12-05 12:53:17.130 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:17 np0005546954 nova_compute[187160]: 2025-12-05 12:53:17.135 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:17 np0005546954 systemd-udevd[212600]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:53:17 np0005546954 NetworkManager[55665]: <info>  [1764939197.1579] device (tapa0d238fe-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:53:17 np0005546954 NetworkManager[55665]: <info>  [1764939197.1587] device (tapa0d238fe-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:53:17 np0005546954 systemd-machined[153497]: New machine qemu-11-instance-0000000c.
Dec  5 07:53:17 np0005546954 systemd[1]: Started Virtual Machine qemu-11-instance-0000000c.
Dec  5 07:53:17 np0005546954 nova_compute[187160]: 2025-12-05 12:53:17.635 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:17 np0005546954 nova_compute[187160]: 2025-12-05 12:53:17.949 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939197.9490557, 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:53:17 np0005546954 nova_compute[187160]: 2025-12-05 12:53:17.950 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] VM Started (Lifecycle Event)#033[00m
Dec  5 07:53:17 np0005546954 nova_compute[187160]: 2025-12-05 12:53:17.970 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:53:18 np0005546954 nova_compute[187160]: 2025-12-05 12:53:18.988 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939198.9880743, 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:53:18 np0005546954 nova_compute[187160]: 2025-12-05 12:53:18.989 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:53:19 np0005546954 nova_compute[187160]: 2025-12-05 12:53:19.013 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:53:19 np0005546954 nova_compute[187160]: 2025-12-05 12:53:19.016 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:53:19 np0005546954 nova_compute[187160]: 2025-12-05 12:53:19.040 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  5 07:53:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:53:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:53:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:53:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:53:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:53:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:53:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:53:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:53:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:53:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:53:20 np0005546954 nova_compute[187160]: 2025-12-05 12:53:20.220 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:20 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:20Z|00116|binding|INFO|Claiming lport a0d238fe-3933-4631-8173-efaff134884c for this chassis.
Dec  5 07:53:20 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:20Z|00117|binding|INFO|a0d238fe-3933-4631-8173-efaff134884c: Claiming fa:16:3e:dd:62:aa 10.100.0.8
Dec  5 07:53:20 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:20Z|00118|binding|INFO|Setting lport a0d238fe-3933-4631-8173-efaff134884c up in Southbound
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.383 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:62:aa 10.100.0.8'], port_security=['fa:16:3e:dd:62:aa 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7b88fad7-6e01-4547-a1cc-39dff2d6f7e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '11', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=a0d238fe-3933-4631-8173-efaff134884c) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.385 104428 INFO neutron.agent.ovn.metadata.agent [-] Port a0d238fe-3933-4631-8173-efaff134884c in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.387 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.405 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c4da0c60-bec7-4ada-a831-9a7956826371]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.439 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[7b65f639-9f07-474e-a025-a806ed6887be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.443 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[42b15d75-0e30-454a-8448-e2e0f4e3643f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.475 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[86b4c9e8-ef68-4745-9927-ab98ffcb4135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.497 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[3bddf9c0-ac54-4b4f-a11a-6d73611b0f36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414675, 'reachable_time': 31640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212637, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.518 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[fa109566-9a0e-4e4e-b158-e6d7bbe6143f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414687, 'tstamp': 414687}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212638, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414691, 'tstamp': 414691}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212638, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.521 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:20 np0005546954 nova_compute[187160]: 2025-12-05 12:53:20.523 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:20 np0005546954 nova_compute[187160]: 2025-12-05 12:53:20.524 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.524 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.525 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.525 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:20.526 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:53:20 np0005546954 nova_compute[187160]: 2025-12-05 12:53:20.569 187164 INFO nova.compute.manager [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Post operation of migration started#033[00m
Dec  5 07:53:20 np0005546954 nova_compute[187160]: 2025-12-05 12:53:20.883 187164 DEBUG oslo_concurrency.lockutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-7b88fad7-6e01-4547-a1cc-39dff2d6f7e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:53:20 np0005546954 nova_compute[187160]: 2025-12-05 12:53:20.884 187164 DEBUG oslo_concurrency.lockutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-7b88fad7-6e01-4547-a1cc-39dff2d6f7e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:53:20 np0005546954 nova_compute[187160]: 2025-12-05 12:53:20.886 187164 DEBUG nova.network.neutron [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:53:22 np0005546954 podman[212639]: 2025-12-05 12:53:22.566206965 +0000 UTC m=+0.072686089 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:53:22 np0005546954 nova_compute[187160]: 2025-12-05 12:53:22.576 187164 DEBUG nova.network.neutron [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Updating instance_info_cache with network_info: [{"id": "a0d238fe-3933-4631-8173-efaff134884c", "address": "fa:16:3e:dd:62:aa", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d238fe-39", "ovs_interfaceid": "a0d238fe-3933-4631-8173-efaff134884c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:53:22 np0005546954 nova_compute[187160]: 2025-12-05 12:53:22.598 187164 DEBUG oslo_concurrency.lockutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-7b88fad7-6e01-4547-a1cc-39dff2d6f7e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:53:22 np0005546954 nova_compute[187160]: 2025-12-05 12:53:22.616 187164 DEBUG oslo_concurrency.lockutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:22 np0005546954 nova_compute[187160]: 2025-12-05 12:53:22.616 187164 DEBUG oslo_concurrency.lockutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:22 np0005546954 nova_compute[187160]: 2025-12-05 12:53:22.616 187164 DEBUG oslo_concurrency.lockutils [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:22 np0005546954 nova_compute[187160]: 2025-12-05 12:53:22.622 187164 INFO nova.virt.libvirt.driver [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  5 07:53:22 np0005546954 virtqemud[186730]: Domain id=11 name='instance-0000000c' uuid=7b88fad7-6e01-4547-a1cc-39dff2d6f7e0 is tainted: custom-monitor
Dec  5 07:53:22 np0005546954 nova_compute[187160]: 2025-12-05 12:53:22.640 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:23 np0005546954 nova_compute[187160]: 2025-12-05 12:53:23.630 187164 INFO nova.virt.libvirt.driver [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  5 07:53:24 np0005546954 nova_compute[187160]: 2025-12-05 12:53:24.638 187164 INFO nova.virt.libvirt.driver [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  5 07:53:24 np0005546954 nova_compute[187160]: 2025-12-05 12:53:24.644 187164 DEBUG nova.compute.manager [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:53:24 np0005546954 nova_compute[187160]: 2025-12-05 12:53:24.674 187164 DEBUG nova.objects.instance [None req-7c045827-764f-495d-8f1d-9b14ea7da8b6 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:53:25 np0005546954 nova_compute[187160]: 2025-12-05 12:53:25.236 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:27 np0005546954 podman[212659]: 2025-12-05 12:53:27.569594217 +0000 UTC m=+0.067007441 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:53:27 np0005546954 nova_compute[187160]: 2025-12-05 12:53:27.641 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:27 np0005546954 podman[212658]: 2025-12-05 12:53:27.66286244 +0000 UTC m=+0.160250473 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.414 187164 DEBUG oslo_concurrency.lockutils [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.415 187164 DEBUG oslo_concurrency.lockutils [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.416 187164 DEBUG oslo_concurrency.lockutils [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.416 187164 DEBUG oslo_concurrency.lockutils [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.417 187164 DEBUG oslo_concurrency.lockutils [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.419 187164 INFO nova.compute.manager [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Terminating instance#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.420 187164 DEBUG nova.compute.manager [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:53:28 np0005546954 kernel: tapa0d238fe-39 (unregistering): left promiscuous mode
Dec  5 07:53:28 np0005546954 NetworkManager[55665]: <info>  [1764939208.4490] device (tapa0d238fe-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:53:28 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:28Z|00119|binding|INFO|Releasing lport a0d238fe-3933-4631-8173-efaff134884c from this chassis (sb_readonly=0)
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.463 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:28 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:28Z|00120|binding|INFO|Setting lport a0d238fe-3933-4631-8173-efaff134884c down in Southbound
Dec  5 07:53:28 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:28Z|00121|binding|INFO|Removing iface tapa0d238fe-39 ovn-installed in OVS
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.466 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.510 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:28 np0005546954 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec  5 07:53:28 np0005546954 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000c.scope: Consumed 1.781s CPU time.
Dec  5 07:53:28 np0005546954 systemd-machined[153497]: Machine qemu-11-instance-0000000c terminated.
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.564 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:62:aa 10.100.0.8'], port_security=['fa:16:3e:dd:62:aa 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7b88fad7-6e01-4547-a1cc-39dff2d6f7e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '11', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=a0d238fe-3933-4631-8173-efaff134884c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.566 104428 INFO neutron.agent.ovn.metadata.agent [-] Port a0d238fe-3933-4631-8173-efaff134884c in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.568 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.583 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c2f5f33f-ad85-42fd-bdda-feaa60e5b48a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.634 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[0525655b-5613-49ff-82e0-e54c3f30cf3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.638 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[21f19b8b-f070-48d3-b376-83909a9d708c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 kernel: tapa0d238fe-39: entered promiscuous mode
Dec  5 07:53:28 np0005546954 kernel: tapa0d238fe-39 (unregistering): left promiscuous mode
Dec  5 07:53:28 np0005546954 NetworkManager[55665]: <info>  [1764939208.6502] manager: (tapa0d238fe-39): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Dec  5 07:53:28 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:28Z|00122|binding|INFO|Claiming lport a0d238fe-3933-4631-8173-efaff134884c for this chassis.
Dec  5 07:53:28 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:28Z|00123|binding|INFO|a0d238fe-3933-4631-8173-efaff134884c: Claiming fa:16:3e:dd:62:aa 10.100.0.8
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.692 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.699 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:62:aa 10.100.0.8'], port_security=['fa:16:3e:dd:62:aa 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7b88fad7-6e01-4547-a1cc-39dff2d6f7e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '11', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=a0d238fe-3933-4631-8173-efaff134884c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:53:28 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:28Z|00124|binding|INFO|Setting lport a0d238fe-3933-4631-8173-efaff134884c ovn-installed in OVS
Dec  5 07:53:28 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:28Z|00125|binding|INFO|Setting lport a0d238fe-3933-4631-8173-efaff134884c up in Southbound
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.709 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.711 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[66064f6a-238b-43d0-b61f-14342dcc175d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.731 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[92dad867-65a8-416b-a1d3-e111b3193965]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414675, 'reachable_time': 31640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212730, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.748 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[1a3ad356-8d57-4535-bb0e-6a80d953e098]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414687, 'tstamp': 414687}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212736, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414691, 'tstamp': 414691}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212736, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.750 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.750 187164 INFO nova.virt.libvirt.driver [-] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Instance destroyed successfully.#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.750 187164 DEBUG nova.objects.instance [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'resources' on Instance uuid 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.752 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:28 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:28Z|00126|binding|INFO|Releasing lport a0d238fe-3933-4631-8173-efaff134884c from this chassis (sb_readonly=0)
Dec  5 07:53:28 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:28Z|00127|binding|INFO|Setting lport a0d238fe-3933-4631-8173-efaff134884c down in Southbound
Dec  5 07:53:28 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:28Z|00128|binding|INFO|Removing iface tapa0d238fe-39 ovn-installed in OVS
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.757 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.758 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.758 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.758 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.759 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.759 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.760 104428 INFO neutron.agent.ovn.metadata.agent [-] Port a0d238fe-3933-4631-8173-efaff134884c in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.761 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.763 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:62:aa 10.100.0.8'], port_security=['fa:16:3e:dd:62:aa 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7b88fad7-6e01-4547-a1cc-39dff2d6f7e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '11', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=a0d238fe-3933-4631-8173-efaff134884c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.768 187164 DEBUG nova.virt.libvirt.vif [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:52:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-795851383',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-795851383',id=12,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:52:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-xt4ngszi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0
',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:53:24Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=7b88fad7-6e01-4547-a1cc-39dff2d6f7e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a0d238fe-3933-4631-8173-efaff134884c", "address": "fa:16:3e:dd:62:aa", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d238fe-39", "ovs_interfaceid": "a0d238fe-3933-4631-8173-efaff134884c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.768 187164 DEBUG nova.network.os_vif_util [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "a0d238fe-3933-4631-8173-efaff134884c", "address": "fa:16:3e:dd:62:aa", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d238fe-39", "ovs_interfaceid": "a0d238fe-3933-4631-8173-efaff134884c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.768 187164 DEBUG nova.network.os_vif_util [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:62:aa,bridge_name='br-int',has_traffic_filtering=True,id=a0d238fe-3933-4631-8173-efaff134884c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d238fe-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.769 187164 DEBUG os_vif [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:62:aa,bridge_name='br-int',has_traffic_filtering=True,id=a0d238fe-3933-4631-8173-efaff134884c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d238fe-39') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.770 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.771 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0d238fe-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.772 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.772 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.773 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.775 187164 INFO os_vif [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:62:aa,bridge_name='br-int',has_traffic_filtering=True,id=a0d238fe-3933-4631-8173-efaff134884c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d238fe-39')#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.775 187164 INFO nova.virt.libvirt.driver [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Deleting instance files /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0_del#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.776 187164 INFO nova.virt.libvirt.driver [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Deletion of /var/lib/nova/instances/7b88fad7-6e01-4547-a1cc-39dff2d6f7e0_del complete#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.777 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[1e0d3c0f-94c5-4d2d-915b-ccfac3fd71e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.806 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[8a374ce1-100f-45db-95f2-7348fafdc8a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.809 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[1877f9d0-d143-4e79-b26f-91aed0c8f7cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.826 187164 INFO nova.compute.manager [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.826 187164 DEBUG oslo.service.loopingcall [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.827 187164 DEBUG nova.compute.manager [-] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.827 187164 DEBUG nova.network.neutron [-] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.832 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[406863a2-9222-4bf8-a347-8491b087ab7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.858 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4133265a-8af6-4351-bf2b-c62975828f36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 9, 'rx_bytes': 1456, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 9, 'rx_bytes': 1456, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414675, 'reachable_time': 31640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212744, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.879 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[0c48ee66-12a7-4669-a4ca-c04800cbe38c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414687, 'tstamp': 414687}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212745, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414691, 'tstamp': 414691}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212745, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.882 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:28 np0005546954 nova_compute[187160]: 2025-12-05 12:53:28.884 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.886 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.886 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.887 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.887 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.888 104428 INFO neutron.agent.ovn.metadata.agent [-] Port a0d238fe-3933-4631-8173-efaff134884c in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.889 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.906 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[59d88970-75c3-4b7f-b1a8-748bcef39c80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.945 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[250e7246-d537-499d-bf46-0c3e97df8c90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.949 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[fa7bad39-60b6-4f0b-9fe9-4925ce3e6da0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:28.987 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[9a78a643-6968-4242-ac8f-7752ac373e5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:29 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:29.005 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d552c666-17db-441e-9ce4-a79a037358da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 11, 'rx_bytes': 1456, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 11, 'rx_bytes': 1456, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414675, 'reachable_time': 31640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212751, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:29 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:29.022 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[aa3a5cf2-4b85-4267-9dad-4a486da544dd]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414687, 'tstamp': 414687}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212752, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414691, 'tstamp': 414691}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212752, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:29 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:29.024 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.025 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:29 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:29.027 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:29 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:29.027 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:53:29 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:29.027 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:29 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:29.028 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.175 187164 DEBUG nova.compute.manager [req-1d7cdcb9-5509-404e-9bb4-7f2ba0b548d2 req-7347d63f-ed31-4352-893b-a978c6c92c93 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Received event network-vif-unplugged-a0d238fe-3933-4631-8173-efaff134884c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.175 187164 DEBUG oslo_concurrency.lockutils [req-1d7cdcb9-5509-404e-9bb4-7f2ba0b548d2 req-7347d63f-ed31-4352-893b-a978c6c92c93 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.175 187164 DEBUG oslo_concurrency.lockutils [req-1d7cdcb9-5509-404e-9bb4-7f2ba0b548d2 req-7347d63f-ed31-4352-893b-a978c6c92c93 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.175 187164 DEBUG oslo_concurrency.lockutils [req-1d7cdcb9-5509-404e-9bb4-7f2ba0b548d2 req-7347d63f-ed31-4352-893b-a978c6c92c93 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.176 187164 DEBUG nova.compute.manager [req-1d7cdcb9-5509-404e-9bb4-7f2ba0b548d2 req-7347d63f-ed31-4352-893b-a978c6c92c93 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] No waiting events found dispatching network-vif-unplugged-a0d238fe-3933-4631-8173-efaff134884c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.176 187164 DEBUG nova.compute.manager [req-1d7cdcb9-5509-404e-9bb4-7f2ba0b548d2 req-7347d63f-ed31-4352-893b-a978c6c92c93 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Received event network-vif-unplugged-a0d238fe-3933-4631-8173-efaff134884c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:53:29 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:29.499 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.500 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:29 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:29.501 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.544 187164 DEBUG nova.network.neutron [-] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.565 187164 INFO nova.compute.manager [-] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Took 0.74 seconds to deallocate network for instance.#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.615 187164 DEBUG oslo_concurrency.lockutils [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.616 187164 DEBUG oslo_concurrency.lockutils [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.619 187164 DEBUG nova.compute.manager [req-234f5a74-4848-48a8-9bc0-5850a9c5584f req-ac84995a-4aef-4fbc-97fa-8cf09fb073b7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Received event network-vif-deleted-a0d238fe-3933-4631-8173-efaff134884c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.625 187164 DEBUG oslo_concurrency.lockutils [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.658 187164 INFO nova.scheduler.client.report [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Deleted allocations for instance 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0#033[00m
Dec  5 07:53:29 np0005546954 nova_compute[187160]: 2025-12-05 12:53:29.732 187164 DEBUG oslo_concurrency.lockutils [None req-1c940969-8ae3-4d95-8c1b-e184e977082e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.238 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:30 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:30.504 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.636 187164 DEBUG oslo_concurrency.lockutils [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "d8c06fde-c17f-425f-a419-71aa3687ce9d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.636 187164 DEBUG oslo_concurrency.lockutils [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.637 187164 DEBUG oslo_concurrency.lockutils [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.637 187164 DEBUG oslo_concurrency.lockutils [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.637 187164 DEBUG oslo_concurrency.lockutils [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.638 187164 INFO nova.compute.manager [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Terminating instance#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.639 187164 DEBUG nova.compute.manager [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:53:30 np0005546954 kernel: tapd430a736-69 (unregistering): left promiscuous mode
Dec  5 07:53:30 np0005546954 NetworkManager[55665]: <info>  [1764939210.6666] device (tapd430a736-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:53:30 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:30Z|00129|binding|INFO|Releasing lport d430a736-69d8-44c8-a904-c0dd5d569dd7 from this chassis (sb_readonly=0)
Dec  5 07:53:30 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:30Z|00130|binding|INFO|Setting lport d430a736-69d8-44c8-a904-c0dd5d569dd7 down in Southbound
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.674 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:30 np0005546954 ovn_controller[95566]: 2025-12-05T12:53:30Z|00131|binding|INFO|Removing iface tapd430a736-69 ovn-installed in OVS
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.677 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:30 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:30.682 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:12:ee 10.100.0.7'], port_security=['fa:16:3e:57:12:ee 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'd8c06fde-c17f-425f-a419-71aa3687ce9d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=d430a736-69d8-44c8-a904-c0dd5d569dd7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:53:30 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:30.684 104428 INFO neutron.agent.ovn.metadata.agent [-] Port d430a736-69d8-44c8-a904-c0dd5d569dd7 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 07:53:30 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:30.686 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d4389bc8-2898-48b0-9741-5183b54fe83c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:53:30 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:30.687 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c8720571-f67a-4d46-a0d7-9b50498b946e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:30 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:30.688 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace which is not needed anymore#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.691 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:30 np0005546954 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Dec  5 07:53:30 np0005546954 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000b.scope: Consumed 16.503s CPU time.
Dec  5 07:53:30 np0005546954 systemd-machined[153497]: Machine qemu-10-instance-0000000b terminated.
Dec  5 07:53:30 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[212298]: [NOTICE]   (212312) : haproxy version is 2.8.14-c23fe91
Dec  5 07:53:30 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[212298]: [NOTICE]   (212312) : path to executable is /usr/sbin/haproxy
Dec  5 07:53:30 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[212298]: [WARNING]  (212312) : Exiting Master process...
Dec  5 07:53:30 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[212298]: [ALERT]    (212312) : Current worker (212314) exited with code 143 (Terminated)
Dec  5 07:53:30 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[212298]: [WARNING]  (212312) : All workers exited. Exiting... (0)
Dec  5 07:53:30 np0005546954 systemd[1]: libpod-1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b.scope: Deactivated successfully.
Dec  5 07:53:30 np0005546954 podman[212775]: 2025-12-05 12:53:30.86567714 +0000 UTC m=+0.075032213 container died 1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:53:30 np0005546954 NetworkManager[55665]: <error> [1764939210.8658] platform-linux: error reading net:/sys/class/net/tapd430a736-69/dev_id: error reading 4096 bytes from file descriptor: Invalid argument
Dec  5 07:53:30 np0005546954 NetworkManager[55665]: <info>  [1764939210.8670] manager: (tapd430a736-69): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.868 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.918 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:30 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b-userdata-shm.mount: Deactivated successfully.
Dec  5 07:53:30 np0005546954 systemd[1]: var-lib-containers-storage-overlay-ad9161cf92b7f699eab1b887fa35832cea80dd03eae71f040429509a05f1a38f-merged.mount: Deactivated successfully.
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.959 187164 INFO nova.virt.libvirt.driver [-] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Instance destroyed successfully.#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.960 187164 DEBUG nova.objects.instance [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'resources' on Instance uuid d8c06fde-c17f-425f-a419-71aa3687ce9d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.981 187164 DEBUG nova.virt.libvirt.vif [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:52:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-452631592',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-452631592',id=11,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:52:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-5q5hgc3u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:52:21Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=d8c06fde-c17f-425f-a419-71aa3687ce9d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "address": "fa:16:3e:57:12:ee", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd430a736-69", "ovs_interfaceid": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.982 187164 DEBUG nova.network.os_vif_util [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "address": "fa:16:3e:57:12:ee", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd430a736-69", "ovs_interfaceid": "d430a736-69d8-44c8-a904-c0dd5d569dd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.983 187164 DEBUG nova.network.os_vif_util [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:12:ee,bridge_name='br-int',has_traffic_filtering=True,id=d430a736-69d8-44c8-a904-c0dd5d569dd7,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd430a736-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.984 187164 DEBUG os_vif [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:12:ee,bridge_name='br-int',has_traffic_filtering=True,id=d430a736-69d8-44c8-a904-c0dd5d569dd7,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd430a736-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.986 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:30 np0005546954 nova_compute[187160]: 2025-12-05 12:53:30.987 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd430a736-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.016 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.019 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.022 187164 INFO os_vif [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:12:ee,bridge_name='br-int',has_traffic_filtering=True,id=d430a736-69d8-44c8-a904-c0dd5d569dd7,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd430a736-69')#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.023 187164 INFO nova.virt.libvirt.driver [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Deleting instance files /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d_del#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.024 187164 INFO nova.virt.libvirt.driver [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Deletion of /var/lib/nova/instances/d8c06fde-c17f-425f-a419-71aa3687ce9d_del complete#033[00m
Dec  5 07:53:31 np0005546954 podman[212775]: 2025-12-05 12:53:31.079676935 +0000 UTC m=+0.289032008 container cleanup 1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:53:31 np0005546954 systemd[1]: libpod-conmon-1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b.scope: Deactivated successfully.
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.104 187164 INFO nova.compute.manager [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.104 187164 DEBUG oslo.service.loopingcall [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.105 187164 DEBUG nova.compute.manager [-] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.105 187164 DEBUG nova.network.neutron [-] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:53:31 np0005546954 podman[212816]: 2025-12-05 12:53:31.171243994 +0000 UTC m=+0.063406157 container remove 1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:53:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:31.176 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[9c001702-b9cc-421e-83ee-39d080fcdce0]: (4, ('Fri Dec  5 12:53:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b)\n1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b\nFri Dec  5 12:53:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b)\n1f24008db5d17895f424ca346fc9f278bda3122a63689453654f18501dadce7b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:31.178 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[2fa28d22-7c71-4619-9edd-dcbc4554d40a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:31.179 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.180 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:31 np0005546954 kernel: tapd4389bc8-20: left promiscuous mode
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.182 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:31.184 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[32948252-0ea9-48d9-86c3-3489be117336]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.195 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:31.207 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f5cdec7f-e905-4e91-98b5-fdd749046245]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:31.209 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ce17adff-331a-479a-832b-57ad3cfcac58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:31.224 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e2393e89-b7f9-4571-b0de-adf967c1d4ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414667, 'reachable_time': 25779, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212831, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:31 np0005546954 systemd[1]: run-netns-ovnmeta\x2dd4389bc8\x2d2898\x2d48b0\x2d9741\x2d5183b54fe83c.mount: Deactivated successfully.
Dec  5 07:53:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:31.229 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:53:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:53:31.230 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[d4efabe7-f231-4caf-8075-525fdf38eb1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.294 187164 DEBUG nova.compute.manager [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Received event network-vif-plugged-a0d238fe-3933-4631-8173-efaff134884c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.294 187164 DEBUG oslo_concurrency.lockutils [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.295 187164 DEBUG oslo_concurrency.lockutils [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.295 187164 DEBUG oslo_concurrency.lockutils [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.295 187164 DEBUG nova.compute.manager [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] No waiting events found dispatching network-vif-plugged-a0d238fe-3933-4631-8173-efaff134884c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.296 187164 WARNING nova.compute.manager [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Received unexpected event network-vif-plugged-a0d238fe-3933-4631-8173-efaff134884c for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.296 187164 DEBUG nova.compute.manager [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Received event network-vif-plugged-a0d238fe-3933-4631-8173-efaff134884c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.296 187164 DEBUG oslo_concurrency.lockutils [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.297 187164 DEBUG oslo_concurrency.lockutils [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.297 187164 DEBUG oslo_concurrency.lockutils [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.297 187164 DEBUG nova.compute.manager [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] No waiting events found dispatching network-vif-plugged-a0d238fe-3933-4631-8173-efaff134884c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.298 187164 WARNING nova.compute.manager [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Received unexpected event network-vif-plugged-a0d238fe-3933-4631-8173-efaff134884c for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.298 187164 DEBUG nova.compute.manager [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Received event network-vif-plugged-a0d238fe-3933-4631-8173-efaff134884c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.298 187164 DEBUG oslo_concurrency.lockutils [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.299 187164 DEBUG oslo_concurrency.lockutils [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.299 187164 DEBUG oslo_concurrency.lockutils [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.299 187164 DEBUG nova.compute.manager [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] No waiting events found dispatching network-vif-plugged-a0d238fe-3933-4631-8173-efaff134884c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.300 187164 WARNING nova.compute.manager [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Received unexpected event network-vif-plugged-a0d238fe-3933-4631-8173-efaff134884c for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.300 187164 DEBUG nova.compute.manager [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Received event network-vif-plugged-a0d238fe-3933-4631-8173-efaff134884c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.300 187164 DEBUG oslo_concurrency.lockutils [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.300 187164 DEBUG oslo_concurrency.lockutils [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.301 187164 DEBUG oslo_concurrency.lockutils [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "7b88fad7-6e01-4547-a1cc-39dff2d6f7e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.301 187164 DEBUG nova.compute.manager [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] No waiting events found dispatching network-vif-plugged-a0d238fe-3933-4631-8173-efaff134884c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.301 187164 WARNING nova.compute.manager [req-26868b55-773d-4877-82aa-8418ce4eecae req-f2eb19ca-3c7e-40b6-a77f-a1adc5a61fd4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Received unexpected event network-vif-plugged-a0d238fe-3933-4631-8173-efaff134884c for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.701 187164 DEBUG nova.network.neutron [-] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.713 187164 DEBUG nova.compute.manager [req-bbd84ad1-cda6-49de-a211-8a1532419d75 req-e76af5e4-92d0-4d89-9c18-a75427645faa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Received event network-vif-unplugged-d430a736-69d8-44c8-a904-c0dd5d569dd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.714 187164 DEBUG oslo_concurrency.lockutils [req-bbd84ad1-cda6-49de-a211-8a1532419d75 req-e76af5e4-92d0-4d89-9c18-a75427645faa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.714 187164 DEBUG oslo_concurrency.lockutils [req-bbd84ad1-cda6-49de-a211-8a1532419d75 req-e76af5e4-92d0-4d89-9c18-a75427645faa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.715 187164 DEBUG oslo_concurrency.lockutils [req-bbd84ad1-cda6-49de-a211-8a1532419d75 req-e76af5e4-92d0-4d89-9c18-a75427645faa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.715 187164 DEBUG nova.compute.manager [req-bbd84ad1-cda6-49de-a211-8a1532419d75 req-e76af5e4-92d0-4d89-9c18-a75427645faa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] No waiting events found dispatching network-vif-unplugged-d430a736-69d8-44c8-a904-c0dd5d569dd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.716 187164 DEBUG nova.compute.manager [req-bbd84ad1-cda6-49de-a211-8a1532419d75 req-e76af5e4-92d0-4d89-9c18-a75427645faa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Received event network-vif-unplugged-d430a736-69d8-44c8-a904-c0dd5d569dd7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.716 187164 DEBUG nova.compute.manager [req-bbd84ad1-cda6-49de-a211-8a1532419d75 req-e76af5e4-92d0-4d89-9c18-a75427645faa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Received event network-vif-plugged-d430a736-69d8-44c8-a904-c0dd5d569dd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.716 187164 DEBUG oslo_concurrency.lockutils [req-bbd84ad1-cda6-49de-a211-8a1532419d75 req-e76af5e4-92d0-4d89-9c18-a75427645faa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.717 187164 DEBUG oslo_concurrency.lockutils [req-bbd84ad1-cda6-49de-a211-8a1532419d75 req-e76af5e4-92d0-4d89-9c18-a75427645faa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.717 187164 DEBUG oslo_concurrency.lockutils [req-bbd84ad1-cda6-49de-a211-8a1532419d75 req-e76af5e4-92d0-4d89-9c18-a75427645faa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.718 187164 DEBUG nova.compute.manager [req-bbd84ad1-cda6-49de-a211-8a1532419d75 req-e76af5e4-92d0-4d89-9c18-a75427645faa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] No waiting events found dispatching network-vif-plugged-d430a736-69d8-44c8-a904-c0dd5d569dd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.718 187164 WARNING nova.compute.manager [req-bbd84ad1-cda6-49de-a211-8a1532419d75 req-e76af5e4-92d0-4d89-9c18-a75427645faa 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Received unexpected event network-vif-plugged-d430a736-69d8-44c8-a904-c0dd5d569dd7 for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.724 187164 INFO nova.compute.manager [-] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Took 0.62 seconds to deallocate network for instance.#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.875 187164 DEBUG oslo_concurrency.lockutils [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.876 187164 DEBUG oslo_concurrency.lockutils [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.929 187164 DEBUG nova.compute.provider_tree [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.944 187164 DEBUG nova.scheduler.client.report [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.965 187164 DEBUG oslo_concurrency.lockutils [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:31 np0005546954 nova_compute[187160]: 2025-12-05 12:53:31.989 187164 INFO nova.scheduler.client.report [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Deleted allocations for instance d8c06fde-c17f-425f-a419-71aa3687ce9d#033[00m
Dec  5 07:53:32 np0005546954 nova_compute[187160]: 2025-12-05 12:53:32.045 187164 DEBUG oslo_concurrency.lockutils [None req-3ce38d46-ae09-4fd3-87d8-cd001b47c649 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "d8c06fde-c17f-425f-a419-71aa3687ce9d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.409s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:53:33 np0005546954 nova_compute[187160]: 2025-12-05 12:53:33.901 187164 DEBUG nova.compute.manager [req-df4df0d8-90cb-4006-8550-d9dce1cd0f26 req-f678dda0-9ac7-4dac-a640-cd03b366e288 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Received event network-vif-deleted-d430a736-69d8-44c8-a904-c0dd5d569dd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:53:35 np0005546954 nova_compute[187160]: 2025-12-05 12:53:35.241 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:35 np0005546954 podman[197513]: time="2025-12-05T12:53:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:53:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:53:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:53:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:53:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Dec  5 07:53:36 np0005546954 nova_compute[187160]: 2025-12-05 12:53:36.017 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:40 np0005546954 nova_compute[187160]: 2025-12-05 12:53:40.244 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:40 np0005546954 podman[212832]: 2025-12-05 12:53:40.556942548 +0000 UTC m=+0.064350928 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Dec  5 07:53:40 np0005546954 podman[212833]: 2025-12-05 12:53:40.571697641 +0000 UTC m=+0.069692426 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:53:41 np0005546954 nova_compute[187160]: 2025-12-05 12:53:41.019 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:43 np0005546954 nova_compute[187160]: 2025-12-05 12:53:43.747 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939208.745682, 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:53:43 np0005546954 nova_compute[187160]: 2025-12-05 12:53:43.747 187164 INFO nova.compute.manager [-] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:53:43 np0005546954 nova_compute[187160]: 2025-12-05 12:53:43.963 187164 DEBUG nova.compute.manager [None req-0d3ceec4-0539-41c0-a432-de4cda224489 - - - - - -] [instance: 7b88fad7-6e01-4547-a1cc-39dff2d6f7e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:53:45 np0005546954 nova_compute[187160]: 2025-12-05 12:53:45.246 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:45 np0005546954 nova_compute[187160]: 2025-12-05 12:53:45.957 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939210.955207, d8c06fde-c17f-425f-a419-71aa3687ce9d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:53:45 np0005546954 nova_compute[187160]: 2025-12-05 12:53:45.957 187164 INFO nova.compute.manager [-] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:53:45 np0005546954 nova_compute[187160]: 2025-12-05 12:53:45.979 187164 DEBUG nova.compute.manager [None req-364c9386-7eed-4b21-8ad7-680dbfd25560 - - - - - -] [instance: d8c06fde-c17f-425f-a419-71aa3687ce9d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:53:46 np0005546954 nova_compute[187160]: 2025-12-05 12:53:46.021 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:53:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:53:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:53:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:53:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:53:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:53:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:53:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:53:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:53:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:53:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:53:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:53:50 np0005546954 nova_compute[187160]: 2025-12-05 12:53:50.249 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:51 np0005546954 nova_compute[187160]: 2025-12-05 12:53:51.024 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:53 np0005546954 podman[212875]: 2025-12-05 12:53:53.550866374 +0000 UTC m=+0.061964782 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:53:55 np0005546954 nova_compute[187160]: 2025-12-05 12:53:55.251 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:56 np0005546954 nova_compute[187160]: 2025-12-05 12:53:56.027 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:53:58 np0005546954 nova_compute[187160]: 2025-12-05 12:53:58.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:53:58 np0005546954 nova_compute[187160]: 2025-12-05 12:53:58.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:53:58 np0005546954 nova_compute[187160]: 2025-12-05 12:53:58.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:53:58 np0005546954 nova_compute[187160]: 2025-12-05 12:53:58.068 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:53:58 np0005546954 nova_compute[187160]: 2025-12-05 12:53:58.069 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:53:58 np0005546954 nova_compute[187160]: 2025-12-05 12:53:58.069 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:53:58 np0005546954 podman[212898]: 2025-12-05 12:53:58.57448209 +0000 UTC m=+0.066835575 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:53:58 np0005546954 podman[212897]: 2025-12-05 12:53:58.61692474 +0000 UTC m=+0.111981330 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:54:00 np0005546954 nova_compute[187160]: 2025-12-05 12:54:00.253 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:01 np0005546954 nova_compute[187160]: 2025-12-05 12:54:01.029 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:01 np0005546954 nova_compute[187160]: 2025-12-05 12:54:01.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:54:01 np0005546954 ovn_controller[95566]: 2025-12-05T12:54:01Z|00132|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Dec  5 07:54:05 np0005546954 nova_compute[187160]: 2025-12-05 12:54:05.033 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:54:05 np0005546954 nova_compute[187160]: 2025-12-05 12:54:05.256 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:05 np0005546954 podman[197513]: time="2025-12-05T12:54:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:54:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:54:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:54:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:54:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Dec  5 07:54:06 np0005546954 nova_compute[187160]: 2025-12-05 12:54:06.033 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.068 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.069 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.070 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.070 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.301 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.302 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5880MB free_disk=73.33632278442383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.303 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.303 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.360 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.361 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.383 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing inventories for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.406 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating ProviderTree inventory for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.407 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.422 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing aggregate associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.445 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing trait associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.468 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.491 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.514 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:54:07 np0005546954 nova_compute[187160]: 2025-12-05 12:54:07.514 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:54:10 np0005546954 nova_compute[187160]: 2025-12-05 12:54:10.259 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:10 np0005546954 nova_compute[187160]: 2025-12-05 12:54:10.515 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:54:11 np0005546954 nova_compute[187160]: 2025-12-05 12:54:11.035 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:11 np0005546954 podman[212950]: 2025-12-05 12:54:11.581606312 +0000 UTC m=+0.076203250 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 07:54:11 np0005546954 podman[212949]: 2025-12-05 12:54:11.590542012 +0000 UTC m=+0.087298257 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, architecture=x86_64, release=1755695350, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6)
Dec  5 07:54:15 np0005546954 nova_compute[187160]: 2025-12-05 12:54:15.261 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:16 np0005546954 nova_compute[187160]: 2025-12-05 12:54:16.038 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:16.952 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:54:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:16.953 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:54:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:16.953 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:54:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:54:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:54:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:54:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:54:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:54:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:54:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:54:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:54:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:54:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:54:20 np0005546954 nova_compute[187160]: 2025-12-05 12:54:20.262 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:21 np0005546954 nova_compute[187160]: 2025-12-05 12:54:21.041 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:24 np0005546954 podman[212990]: 2025-12-05 12:54:24.541068858 +0000 UTC m=+0.057117361 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:54:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:24.842 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:54:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:24.843 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:54:24 np0005546954 nova_compute[187160]: 2025-12-05 12:54:24.881 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:25 np0005546954 nova_compute[187160]: 2025-12-05 12:54:25.264 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:26 np0005546954 nova_compute[187160]: 2025-12-05 12:54:26.043 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:26.847 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:54:29 np0005546954 podman[213009]: 2025-12-05 12:54:29.574687168 +0000 UTC m=+0.075392414 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:54:29 np0005546954 podman[213008]: 2025-12-05 12:54:29.588770949 +0000 UTC m=+0.103398862 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller)
Dec  5 07:54:30 np0005546954 nova_compute[187160]: 2025-12-05 12:54:30.266 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.046 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.274 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.275 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.308 187164 DEBUG nova.compute.manager [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.374 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.375 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.381 187164 DEBUG nova.virt.hardware [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.381 187164 INFO nova.compute.claims [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.526 187164 DEBUG nova.compute.provider_tree [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.549 187164 DEBUG nova.scheduler.client.report [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.573 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.574 187164 DEBUG nova.compute.manager [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.631 187164 DEBUG nova.compute.manager [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.632 187164 DEBUG nova.network.neutron [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.664 187164 INFO nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.697 187164 DEBUG nova.compute.manager [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.814 187164 DEBUG nova.policy [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ae0bb20ac8b4be99eb1abddc7310436', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.847 187164 DEBUG nova.compute.manager [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.849 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.849 187164 INFO nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Creating image(s)#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.850 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "/var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.850 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.850 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.863 187164 DEBUG oslo_concurrency.processutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.954 187164 DEBUG oslo_concurrency.processutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.956 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.957 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:54:31 np0005546954 nova_compute[187160]: 2025-12-05 12:54:31.971 187164 DEBUG oslo_concurrency.processutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:54:32 np0005546954 nova_compute[187160]: 2025-12-05 12:54:32.049 187164 DEBUG oslo_concurrency.processutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:54:32 np0005546954 nova_compute[187160]: 2025-12-05 12:54:32.051 187164 DEBUG oslo_concurrency.processutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:54:32 np0005546954 nova_compute[187160]: 2025-12-05 12:54:32.685 187164 DEBUG nova.network.neutron [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Successfully created port: e78f1ad2-5aff-40db-94db-beab9e8252a9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:54:32 np0005546954 nova_compute[187160]: 2025-12-05 12:54:32.985 187164 DEBUG oslo_concurrency.processutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk 1073741824" returned: 0 in 0.934s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:54:32 np0005546954 nova_compute[187160]: 2025-12-05 12:54:32.987 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:54:32 np0005546954 nova_compute[187160]: 2025-12-05 12:54:32.988 187164 DEBUG oslo_concurrency.processutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.082 187164 DEBUG oslo_concurrency.processutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.084 187164 DEBUG nova.virt.disk.api [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Checking if we can resize image /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.085 187164 DEBUG oslo_concurrency.processutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.170 187164 DEBUG oslo_concurrency.processutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.172 187164 DEBUG nova.virt.disk.api [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Cannot resize image /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.173 187164 DEBUG nova.objects.instance [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'migration_context' on Instance uuid 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.189 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.190 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Ensure instance console log exists: /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.191 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.192 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.193 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.626 187164 DEBUG nova.network.neutron [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Successfully updated port: e78f1ad2-5aff-40db-94db-beab9e8252a9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.644 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "refresh_cache-245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.645 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquired lock "refresh_cache-245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.645 187164 DEBUG nova.network.neutron [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.736 187164 DEBUG nova.compute.manager [req-11b458f6-c285-4d55-ac82-4e88646df071 req-ce632a1b-6ab6-4f91-9100-2069ae48a5ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Received event network-changed-e78f1ad2-5aff-40db-94db-beab9e8252a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.736 187164 DEBUG nova.compute.manager [req-11b458f6-c285-4d55-ac82-4e88646df071 req-ce632a1b-6ab6-4f91-9100-2069ae48a5ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Refreshing instance network info cache due to event network-changed-e78f1ad2-5aff-40db-94db-beab9e8252a9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.737 187164 DEBUG oslo_concurrency.lockutils [req-11b458f6-c285-4d55-ac82-4e88646df071 req-ce632a1b-6ab6-4f91-9100-2069ae48a5ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:54:33 np0005546954 nova_compute[187160]: 2025-12-05 12:54:33.811 187164 DEBUG nova.network.neutron [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.688 187164 DEBUG nova.network.neutron [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Updating instance_info_cache with network_info: [{"id": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "address": "fa:16:3e:6e:9f:53", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape78f1ad2-5a", "ovs_interfaceid": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.715 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Releasing lock "refresh_cache-245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.716 187164 DEBUG nova.compute.manager [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Instance network_info: |[{"id": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "address": "fa:16:3e:6e:9f:53", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape78f1ad2-5a", "ovs_interfaceid": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.716 187164 DEBUG oslo_concurrency.lockutils [req-11b458f6-c285-4d55-ac82-4e88646df071 req-ce632a1b-6ab6-4f91-9100-2069ae48a5ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.717 187164 DEBUG nova.network.neutron [req-11b458f6-c285-4d55-ac82-4e88646df071 req-ce632a1b-6ab6-4f91-9100-2069ae48a5ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Refreshing network info cache for port e78f1ad2-5aff-40db-94db-beab9e8252a9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.720 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Start _get_guest_xml network_info=[{"id": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "address": "fa:16:3e:6e:9f:53", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape78f1ad2-5a", "ovs_interfaceid": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.726 187164 WARNING nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.731 187164 DEBUG nova.virt.libvirt.host [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.731 187164 DEBUG nova.virt.libvirt.host [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.735 187164 DEBUG nova.virt.libvirt.host [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.735 187164 DEBUG nova.virt.libvirt.host [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.737 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.737 187164 DEBUG nova.virt.hardware [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.737 187164 DEBUG nova.virt.hardware [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.737 187164 DEBUG nova.virt.hardware [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.738 187164 DEBUG nova.virt.hardware [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.738 187164 DEBUG nova.virt.hardware [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.738 187164 DEBUG nova.virt.hardware [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.738 187164 DEBUG nova.virt.hardware [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.738 187164 DEBUG nova.virt.hardware [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.739 187164 DEBUG nova.virt.hardware [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.739 187164 DEBUG nova.virt.hardware [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.739 187164 DEBUG nova.virt.hardware [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.744 187164 DEBUG nova.virt.libvirt.vif [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:54:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1881175617',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1881175617',id=14,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-y8ivd8ap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:54:31Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "address": "fa:16:3e:6e:9f:53", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape78f1ad2-5a", "ovs_interfaceid": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.744 187164 DEBUG nova.network.os_vif_util [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "address": "fa:16:3e:6e:9f:53", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape78f1ad2-5a", "ovs_interfaceid": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.745 187164 DEBUG nova.network.os_vif_util [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:9f:53,bridge_name='br-int',has_traffic_filtering=True,id=e78f1ad2-5aff-40db-94db-beab9e8252a9,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape78f1ad2-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.747 187164 DEBUG nova.objects.instance [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'pci_devices' on Instance uuid 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.766 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  <uuid>245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1</uuid>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  <name>instance-0000000e</name>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteStrategies-server-1881175617</nova:name>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 12:54:34</nova:creationTime>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:        <nova:user uuid="0ae0bb20ac8b4be99eb1abddc7310436">tempest-TestExecuteStrategies-192029678-project-member</nova:user>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:        <nova:project uuid="e6ae0d0dcde04b85b6dae45560cca988">tempest-TestExecuteStrategies-192029678</nova:project>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:        <nova:port uuid="e78f1ad2-5aff-40db-94db-beab9e8252a9">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <system>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <entry name="serial">245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1</entry>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <entry name="uuid">245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1</entry>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    </system>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  <os>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  </clock>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk.config"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:6e:9f:53"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <target dev="tape78f1ad2-5a"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/console.log" append="off"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    </serial>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <video>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 07:54:34 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 07:54:34 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:54:34 np0005546954 nova_compute[187160]: </domain>
Dec  5 07:54:34 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.768 187164 DEBUG nova.compute.manager [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Preparing to wait for external event network-vif-plugged-e78f1ad2-5aff-40db-94db-beab9e8252a9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.768 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.768 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.768 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.769 187164 DEBUG nova.virt.libvirt.vif [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:54:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1881175617',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1881175617',id=14,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-y8ivd8ap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:54:31Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "address": "fa:16:3e:6e:9f:53", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape78f1ad2-5a", "ovs_interfaceid": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.769 187164 DEBUG nova.network.os_vif_util [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "address": "fa:16:3e:6e:9f:53", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape78f1ad2-5a", "ovs_interfaceid": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.770 187164 DEBUG nova.network.os_vif_util [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:9f:53,bridge_name='br-int',has_traffic_filtering=True,id=e78f1ad2-5aff-40db-94db-beab9e8252a9,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape78f1ad2-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.770 187164 DEBUG os_vif [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:9f:53,bridge_name='br-int',has_traffic_filtering=True,id=e78f1ad2-5aff-40db-94db-beab9e8252a9,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape78f1ad2-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.771 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.772 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.772 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.777 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.778 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape78f1ad2-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.778 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape78f1ad2-5a, col_values=(('external_ids', {'iface-id': 'e78f1ad2-5aff-40db-94db-beab9e8252a9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6e:9f:53', 'vm-uuid': '245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.781 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.782 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:54:34 np0005546954 NetworkManager[55665]: <info>  [1764939274.7828] manager: (tape78f1ad2-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.788 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.789 187164 INFO os_vif [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:9f:53,bridge_name='br-int',has_traffic_filtering=True,id=e78f1ad2-5aff-40db-94db-beab9e8252a9,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape78f1ad2-5a')#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.836 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.836 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.836 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No VIF found with MAC fa:16:3e:6e:9f:53, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec  5 07:54:34 np0005546954 nova_compute[187160]: 2025-12-05 12:54:34.837 187164 INFO nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Using config drive
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.268 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.309 187164 INFO nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Creating config drive at /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk.config
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.313 187164 DEBUG oslo_concurrency.processutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmuveyapn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.446 187164 DEBUG oslo_concurrency.processutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmuveyapn" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:54:35 np0005546954 kernel: tape78f1ad2-5a: entered promiscuous mode
Dec  5 07:54:35 np0005546954 NetworkManager[55665]: <info>  [1764939275.5172] manager: (tape78f1ad2-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Dec  5 07:54:35 np0005546954 ovn_controller[95566]: 2025-12-05T12:54:35Z|00133|binding|INFO|Claiming lport e78f1ad2-5aff-40db-94db-beab9e8252a9 for this chassis.
Dec  5 07:54:35 np0005546954 ovn_controller[95566]: 2025-12-05T12:54:35Z|00134|binding|INFO|e78f1ad2-5aff-40db-94db-beab9e8252a9: Claiming fa:16:3e:6e:9f:53 10.100.0.4
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.520 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.525 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:9f:53 10.100.0.4'], port_security=['fa:16:3e:6e:9f:53 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=e78f1ad2-5aff-40db-94db-beab9e8252a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.527 104428 INFO neutron.agent.ovn.metadata.agent [-] Port e78f1ad2-5aff-40db-94db-beab9e8252a9 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.529 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 07:54:35 np0005546954 ovn_controller[95566]: 2025-12-05T12:54:35Z|00135|binding|INFO|Setting lport e78f1ad2-5aff-40db-94db-beab9e8252a9 ovn-installed in OVS
Dec  5 07:54:35 np0005546954 ovn_controller[95566]: 2025-12-05T12:54:35Z|00136|binding|INFO|Setting lport e78f1ad2-5aff-40db-94db-beab9e8252a9 up in Southbound
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.535 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.539 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.545 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e04076c1-d703-43d2-b527-f8b211565439]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.547 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd4389bc8-21 in ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec  5 07:54:35 np0005546954 systemd-udevd[213091]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.550 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd4389bc8-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.550 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e2e835-147b-410d-8a40-23b310318999]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.551 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[26b8307f-ff4b-4307-b613-db321fbcf97d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 systemd-machined[153497]: New machine qemu-12-instance-0000000e.
Dec  5 07:54:35 np0005546954 NetworkManager[55665]: <info>  [1764939275.5656] device (tape78f1ad2-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.564 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[86a66892-af31-489c-84e1-1b3e84bf78fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 NetworkManager[55665]: <info>  [1764939275.5668] device (tape78f1ad2-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:54:35 np0005546954 systemd[1]: Started Virtual Machine qemu-12-instance-0000000e.
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.593 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5b3c6ede-4eba-4cd5-96f8-41f5113cca77]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.631 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[8ddd29dd-dddd-444e-a252-5ff5814979da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 systemd-udevd[213095]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.638 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[bd40b9ba-42a5-4c98-95e1-6bea6f5d3c10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 NetworkManager[55665]: <info>  [1764939275.6396] manager: (tapd4389bc8-20): new Veth device (/org/freedesktop/NetworkManager/Devices/56)
Dec  5 07:54:35 np0005546954 podman[197513]: time="2025-12-05T12:54:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:54:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:54:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:54:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:54:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2581 "" "Go-http-client/1.1"
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.675 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[6e1e319e-367b-42ce-9ea0-4ea0169f7386]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.679 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[8d48063f-aad6-460b-a9b6-fb901283e841]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 NetworkManager[55665]: <info>  [1764939275.7019] device (tapd4389bc8-20): carrier: link connected
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.709 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[2a930eb9-3841-4e39-8faa-6a4967f8b03a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.726 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[66a2476a-085a-48f2-b90b-dce31c9372cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428232, 'reachable_time': 35465, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213126, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.746 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[cf721489-3863-40a8-abfb-336e8be3fc14]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:43f7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428232, 'tstamp': 428232}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213127, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.765 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[665901d4-09c9-4950-bf7d-bcc33cc0c058]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428232, 'reachable_time': 35465, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213128, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.806 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[44751311-0710-4074-98bd-5d3448903aec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.818 187164 DEBUG nova.compute.manager [req-b4985444-ee47-4ec3-94b2-71581ef530d0 req-ec5268ce-641b-442a-8273-0b38dba679ab 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Received event network-vif-plugged-e78f1ad2-5aff-40db-94db-beab9e8252a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.820 187164 DEBUG oslo_concurrency.lockutils [req-b4985444-ee47-4ec3-94b2-71581ef530d0 req-ec5268ce-641b-442a-8273-0b38dba679ab 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.821 187164 DEBUG oslo_concurrency.lockutils [req-b4985444-ee47-4ec3-94b2-71581ef530d0 req-ec5268ce-641b-442a-8273-0b38dba679ab 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.822 187164 DEBUG oslo_concurrency.lockutils [req-b4985444-ee47-4ec3-94b2-71581ef530d0 req-ec5268ce-641b-442a-8273-0b38dba679ab 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.822 187164 DEBUG nova.compute.manager [req-b4985444-ee47-4ec3-94b2-71581ef530d0 req-ec5268ce-641b-442a-8273-0b38dba679ab 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Processing event network-vif-plugged-e78f1ad2-5aff-40db-94db-beab9e8252a9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.869 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939275.8692136, 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.871 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] VM Started (Lifecycle Event)
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.876 187164 DEBUG nova.compute.manager [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.881 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.882 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[76bac448-28d0-49f0-aa2d-94cfad91c5e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.886 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.886 187164 INFO nova.virt.libvirt.driver [-] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Instance spawned successfully.
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.887 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.887 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.889 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:54:35 np0005546954 NetworkManager[55665]: <info>  [1764939275.8938] manager: (tapd4389bc8-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Dec  5 07:54:35 np0005546954 kernel: tapd4389bc8-20: entered promiscuous mode
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.894 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.894 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.898 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.900 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:54:35 np0005546954 ovn_controller[95566]: 2025-12-05T12:54:35Z|00137|binding|INFO|Releasing lport 8dbe2af5-9f18-44ca-8f22-66854bcdd596 from this chassis (sb_readonly=0)
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.901 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.903 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.904 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.905 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4b7b70d9-e20c-4d2f-89be-1f2e431e404d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.907 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec  5 07:54:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:54:35.908 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'env', 'PROCESS_TAG=haproxy-d4389bc8-2898-48b0-9741-5183b54fe83c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d4389bc8-2898-48b0-9741-5183b54fe83c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.914 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.915 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.916 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.916 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.917 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.918 187164 DEBUG nova.virt.libvirt.driver [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.924 187164 DEBUG nova.network.neutron [req-11b458f6-c285-4d55-ac82-4e88646df071 req-ce632a1b-6ab6-4f91-9100-2069ae48a5ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Updated VIF entry in instance network info cache for port e78f1ad2-5aff-40db-94db-beab9e8252a9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.924 187164 DEBUG nova.network.neutron [req-11b458f6-c285-4d55-ac82-4e88646df071 req-ce632a1b-6ab6-4f91-9100-2069ae48a5ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Updating instance_info_cache with network_info: [{"id": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "address": "fa:16:3e:6e:9f:53", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape78f1ad2-5a", "ovs_interfaceid": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.925 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.927 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.927 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939275.8707416, 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.928 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.954 187164 DEBUG oslo_concurrency.lockutils [req-11b458f6-c285-4d55-ac82-4e88646df071 req-ce632a1b-6ab6-4f91-9100-2069ae48a5ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.982 187164 INFO nova.compute.manager [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Took 4.13 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.982 187164 DEBUG nova.compute.manager [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.991 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.995 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939275.8806973, 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:54:35 np0005546954 nova_compute[187160]: 2025-12-05 12:54:35.995 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:54:36 np0005546954 nova_compute[187160]: 2025-12-05 12:54:36.039 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:54:36 np0005546954 nova_compute[187160]: 2025-12-05 12:54:36.043 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:54:36 np0005546954 nova_compute[187160]: 2025-12-05 12:54:36.073 187164 INFO nova.compute.manager [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Took 4.72 seconds to build instance.#033[00m
Dec  5 07:54:36 np0005546954 nova_compute[187160]: 2025-12-05 12:54:36.094 187164 DEBUG oslo_concurrency.lockutils [None req-b972c8d0-82a8-4f3c-9ab5-0f8756ae7751 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.819s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:54:36 np0005546954 podman[213166]: 2025-12-05 12:54:36.414947109 +0000 UTC m=+0.126757853 container create 0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec  5 07:54:36 np0005546954 podman[213166]: 2025-12-05 12:54:36.32884293 +0000 UTC m=+0.040653724 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:54:36 np0005546954 systemd[1]: Started libpod-conmon-0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a.scope.
Dec  5 07:54:36 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:54:36 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1dbddd37e847695c55a94471c9e1911e3cbeec7fe55e83877bf887f46cd71b8c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:54:36 np0005546954 podman[213166]: 2025-12-05 12:54:36.668339639 +0000 UTC m=+0.380150433 container init 0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:54:36 np0005546954 podman[213166]: 2025-12-05 12:54:36.675762171 +0000 UTC m=+0.387572905 container start 0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  5 07:54:36 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213181]: [NOTICE]   (213185) : New worker (213187) forked
Dec  5 07:54:36 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213181]: [NOTICE]   (213185) : Loading success.
Dec  5 07:54:37 np0005546954 nova_compute[187160]: 2025-12-05 12:54:37.902 187164 DEBUG nova.compute.manager [req-9dbbae55-47eb-41fd-9ede-2b7845be739c req-53b89659-7199-4a6f-a901-33ffc8d12437 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Received event network-vif-plugged-e78f1ad2-5aff-40db-94db-beab9e8252a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:54:37 np0005546954 nova_compute[187160]: 2025-12-05 12:54:37.903 187164 DEBUG oslo_concurrency.lockutils [req-9dbbae55-47eb-41fd-9ede-2b7845be739c req-53b89659-7199-4a6f-a901-33ffc8d12437 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:54:37 np0005546954 nova_compute[187160]: 2025-12-05 12:54:37.903 187164 DEBUG oslo_concurrency.lockutils [req-9dbbae55-47eb-41fd-9ede-2b7845be739c req-53b89659-7199-4a6f-a901-33ffc8d12437 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:54:37 np0005546954 nova_compute[187160]: 2025-12-05 12:54:37.904 187164 DEBUG oslo_concurrency.lockutils [req-9dbbae55-47eb-41fd-9ede-2b7845be739c req-53b89659-7199-4a6f-a901-33ffc8d12437 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:54:37 np0005546954 nova_compute[187160]: 2025-12-05 12:54:37.904 187164 DEBUG nova.compute.manager [req-9dbbae55-47eb-41fd-9ede-2b7845be739c req-53b89659-7199-4a6f-a901-33ffc8d12437 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] No waiting events found dispatching network-vif-plugged-e78f1ad2-5aff-40db-94db-beab9e8252a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:54:37 np0005546954 nova_compute[187160]: 2025-12-05 12:54:37.904 187164 WARNING nova.compute.manager [req-9dbbae55-47eb-41fd-9ede-2b7845be739c req-53b89659-7199-4a6f-a901-33ffc8d12437 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Received unexpected event network-vif-plugged-e78f1ad2-5aff-40db-94db-beab9e8252a9 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:54:39 np0005546954 nova_compute[187160]: 2025-12-05 12:54:39.784 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:40 np0005546954 nova_compute[187160]: 2025-12-05 12:54:40.321 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:42 np0005546954 podman[213197]: 2025-12-05 12:54:42.57324998 +0000 UTC m=+0.067020241 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  5 07:54:42 np0005546954 podman[213196]: 2025-12-05 12:54:42.578210236 +0000 UTC m=+0.077793449 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec  5 07:54:44 np0005546954 nova_compute[187160]: 2025-12-05 12:54:44.831 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:45 np0005546954 nova_compute[187160]: 2025-12-05 12:54:45.324 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:49 np0005546954 ovn_controller[95566]: 2025-12-05T12:54:49Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6e:9f:53 10.100.0.4
Dec  5 07:54:49 np0005546954 ovn_controller[95566]: 2025-12-05T12:54:49Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6e:9f:53 10.100.0.4
Dec  5 07:54:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:54:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:54:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:54:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:54:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:54:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:54:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:54:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:54:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:54:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:54:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:54:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:54:49 np0005546954 nova_compute[187160]: 2025-12-05 12:54:49.833 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:50 np0005546954 nova_compute[187160]: 2025-12-05 12:54:50.327 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:54 np0005546954 nova_compute[187160]: 2025-12-05 12:54:54.838 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:55 np0005546954 nova_compute[187160]: 2025-12-05 12:54:55.362 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:54:55 np0005546954 podman[213251]: 2025-12-05 12:54:55.579416538 +0000 UTC m=+0.075497126 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 07:54:57 np0005546954 nova_compute[187160]: 2025-12-05 12:54:57.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:54:57 np0005546954 nova_compute[187160]: 2025-12-05 12:54:57.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 07:54:58 np0005546954 nova_compute[187160]: 2025-12-05 12:54:58.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:54:58 np0005546954 nova_compute[187160]: 2025-12-05 12:54:58.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:54:58 np0005546954 nova_compute[187160]: 2025-12-05 12:54:58.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:54:58 np0005546954 nova_compute[187160]: 2025-12-05 12:54:58.615 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:54:58 np0005546954 nova_compute[187160]: 2025-12-05 12:54:58.616 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:54:58 np0005546954 nova_compute[187160]: 2025-12-05 12:54:58.617 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:54:58 np0005546954 nova_compute[187160]: 2025-12-05 12:54:58.617 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:54:59 np0005546954 nova_compute[187160]: 2025-12-05 12:54:59.793 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Updating instance_info_cache with network_info: [{"id": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "address": "fa:16:3e:6e:9f:53", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape78f1ad2-5a", "ovs_interfaceid": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:54:59 np0005546954 nova_compute[187160]: 2025-12-05 12:54:59.816 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:54:59 np0005546954 nova_compute[187160]: 2025-12-05 12:54:59.816 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:54:59 np0005546954 nova_compute[187160]: 2025-12-05 12:54:59.816 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:54:59 np0005546954 nova_compute[187160]: 2025-12-05 12:54:59.817 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:54:59 np0005546954 nova_compute[187160]: 2025-12-05 12:54:59.817 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:54:59 np0005546954 nova_compute[187160]: 2025-12-05 12:54:59.843 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:00 np0005546954 nova_compute[187160]: 2025-12-05 12:55:00.366 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:00 np0005546954 podman[213271]: 2025-12-05 12:55:00.586983811 +0000 UTC m=+0.085956834 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:55:00 np0005546954 podman[213270]: 2025-12-05 12:55:00.665675947 +0000 UTC m=+0.157152625 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:55:01 np0005546954 nova_compute[187160]: 2025-12-05 12:55:01.055 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:55:01 np0005546954 nova_compute[187160]: 2025-12-05 12:55:01.056 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 07:55:01 np0005546954 nova_compute[187160]: 2025-12-05 12:55:01.077 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 07:55:03 np0005546954 nova_compute[187160]: 2025-12-05 12:55:03.061 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:55:04 np0005546954 nova_compute[187160]: 2025-12-05 12:55:04.847 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:05 np0005546954 nova_compute[187160]: 2025-12-05 12:55:05.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:55:05 np0005546954 nova_compute[187160]: 2025-12-05 12:55:05.067 187164 DEBUG nova.virt.libvirt.driver [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Creating tmpfile /var/lib/nova/instances/tmpdjqva5gp to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  5 07:55:05 np0005546954 nova_compute[187160]: 2025-12-05 12:55:05.203 187164 DEBUG nova.compute.manager [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdjqva5gp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  5 07:55:05 np0005546954 nova_compute[187160]: 2025-12-05 12:55:05.370 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:05 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:05Z|00138|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec  5 07:55:05 np0005546954 podman[197513]: time="2025-12-05T12:55:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:55:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:55:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 07:55:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:55:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3053 "" "Go-http-client/1.1"
Dec  5 07:55:06 np0005546954 nova_compute[187160]: 2025-12-05 12:55:06.015 187164 DEBUG nova.compute.manager [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdjqva5gp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6dc1f44b-7b59-497d-b8b4-46919828df13',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  5 07:55:06 np0005546954 nova_compute[187160]: 2025-12-05 12:55:06.047 187164 DEBUG oslo_concurrency.lockutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-6dc1f44b-7b59-497d-b8b4-46919828df13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:55:06 np0005546954 nova_compute[187160]: 2025-12-05 12:55:06.048 187164 DEBUG oslo_concurrency.lockutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-6dc1f44b-7b59-497d-b8b4-46919828df13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:55:06 np0005546954 nova_compute[187160]: 2025-12-05 12:55:06.048 187164 DEBUG nova.network.neutron [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.075 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.076 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.076 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.076 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.146 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.211 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.213 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.270 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.445 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.447 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5711MB free_disk=73.3076057434082GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.447 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.447 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.494 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Migration for instance 6dc1f44b-7b59-497d-b8b4-46919828df13 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.525 187164 INFO nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Updating resource usage from migration 2b32b230-be87-4497-ab36-066f086ca842#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.526 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Starting to track incoming migration 2b32b230-be87-4497-ab36-066f086ca842 with flavor b4ea63be-97f8-4a48-b000-66321c4ddb27 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.606 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.632 187164 WARNING nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 6dc1f44b-7b59-497d-b8b4-46919828df13 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.633 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.633 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.749 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.770 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.801 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.801 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.811 187164 DEBUG nova.network.neutron [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Updating instance_info_cache with network_info: [{"id": "08898a8d-58ea-4d2a-8141-17381d0867b2", "address": "fa:16:3e:a0:ad:31", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08898a8d-58", "ovs_interfaceid": "08898a8d-58ea-4d2a-8141-17381d0867b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.830 187164 DEBUG oslo_concurrency.lockutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-6dc1f44b-7b59-497d-b8b4-46919828df13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.834 187164 DEBUG nova.virt.libvirt.driver [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdjqva5gp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6dc1f44b-7b59-497d-b8b4-46919828df13',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.835 187164 DEBUG nova.virt.libvirt.driver [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Creating instance directory: /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.835 187164 DEBUG nova.virt.libvirt.driver [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Creating disk.info with the contents: {'/var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk': 'qcow2', '/var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.836 187164 DEBUG nova.virt.libvirt.driver [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.837 187164 DEBUG nova.objects.instance [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6dc1f44b-7b59-497d-b8b4-46919828df13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.890 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.954 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.955 187164 DEBUG oslo_concurrency.lockutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.956 187164 DEBUG oslo_concurrency.lockutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:07 np0005546954 nova_compute[187160]: 2025-12-05 12:55:07.971 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.030 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.040 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.079 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.081 187164 DEBUG oslo_concurrency.lockutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.082 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.142 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.144 187164 DEBUG nova.virt.disk.api [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Checking if we can resize image /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.144 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.204 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.205 187164 DEBUG nova.virt.disk.api [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Cannot resize image /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.205 187164 DEBUG nova.objects.instance [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'migration_context' on Instance uuid 6dc1f44b-7b59-497d-b8b4-46919828df13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.219 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.251 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk.config 485376" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.253 187164 DEBUG nova.virt.libvirt.volume.remotefs [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk.config to /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.254 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk.config /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.603 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.632 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Triggering sync for uuid 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.633 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.634 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.660 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.819 187164 DEBUG oslo_concurrency.processutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13/disk.config /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13" returned: 0 in 0.565s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.820 187164 DEBUG nova.virt.libvirt.driver [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.822 187164 DEBUG nova.virt.libvirt.vif [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:54:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-63740190',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-63740190',id=13,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:54:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-9lkcn4ga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:54:25Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=6dc1f44b-7b59-497d-b8b4-46919828df13,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08898a8d-58ea-4d2a-8141-17381d0867b2", "address": "fa:16:3e:a0:ad:31", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap08898a8d-58", "ovs_interfaceid": "08898a8d-58ea-4d2a-8141-17381d0867b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.822 187164 DEBUG nova.network.os_vif_util [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "08898a8d-58ea-4d2a-8141-17381d0867b2", "address": "fa:16:3e:a0:ad:31", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap08898a8d-58", "ovs_interfaceid": "08898a8d-58ea-4d2a-8141-17381d0867b2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.824 187164 DEBUG nova.network.os_vif_util [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:ad:31,bridge_name='br-int',has_traffic_filtering=True,id=08898a8d-58ea-4d2a-8141-17381d0867b2,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08898a8d-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.825 187164 DEBUG os_vif [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:ad:31,bridge_name='br-int',has_traffic_filtering=True,id=08898a8d-58ea-4d2a-8141-17381d0867b2,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08898a8d-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.826 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.827 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.828 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.834 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.835 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08898a8d-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.836 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap08898a8d-58, col_values=(('external_ids', {'iface-id': '08898a8d-58ea-4d2a-8141-17381d0867b2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:ad:31', 'vm-uuid': '6dc1f44b-7b59-497d-b8b4-46919828df13'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.839 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:08 np0005546954 NetworkManager[55665]: <info>  [1764939308.8444] manager: (tap08898a8d-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.845 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.852 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.853 187164 INFO os_vif [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:ad:31,bridge_name='br-int',has_traffic_filtering=True,id=08898a8d-58ea-4d2a-8141-17381d0867b2,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08898a8d-58')#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.854 187164 DEBUG nova.virt.libvirt.driver [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  5 07:55:08 np0005546954 nova_compute[187160]: 2025-12-05 12:55:08.854 187164 DEBUG nova.compute.manager [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdjqva5gp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6dc1f44b-7b59-497d-b8b4-46919828df13',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  5 07:55:09 np0005546954 nova_compute[187160]: 2025-12-05 12:55:09.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:55:09 np0005546954 nova_compute[187160]: 2025-12-05 12:55:09.064 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:55:09 np0005546954 nova_compute[187160]: 2025-12-05 12:55:09.064 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:55:09 np0005546954 nova_compute[187160]: 2025-12-05 12:55:09.675 187164 DEBUG nova.network.neutron [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Port 08898a8d-58ea-4d2a-8141-17381d0867b2 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  5 07:55:09 np0005546954 nova_compute[187160]: 2025-12-05 12:55:09.677 187164 DEBUG nova.compute.manager [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdjqva5gp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6dc1f44b-7b59-497d-b8b4-46919828df13',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  5 07:55:09 np0005546954 kernel: tap08898a8d-58: entered promiscuous mode
Dec  5 07:55:09 np0005546954 NetworkManager[55665]: <info>  [1764939309.9443] manager: (tap08898a8d-58): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Dec  5 07:55:09 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:09Z|00139|binding|INFO|Claiming lport 08898a8d-58ea-4d2a-8141-17381d0867b2 for this additional chassis.
Dec  5 07:55:09 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:09Z|00140|binding|INFO|08898a8d-58ea-4d2a-8141-17381d0867b2: Claiming fa:16:3e:a0:ad:31 10.100.0.3
Dec  5 07:55:09 np0005546954 nova_compute[187160]: 2025-12-05 12:55:09.947 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:09 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:09Z|00141|binding|INFO|Setting lport 08898a8d-58ea-4d2a-8141-17381d0867b2 ovn-installed in OVS
Dec  5 07:55:09 np0005546954 nova_compute[187160]: 2025-12-05 12:55:09.962 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:09 np0005546954 nova_compute[187160]: 2025-12-05 12:55:09.964 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:09 np0005546954 systemd-udevd[213360]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:55:09 np0005546954 NetworkManager[55665]: <info>  [1764939309.9939] device (tap08898a8d-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:55:09 np0005546954 systemd-machined[153497]: New machine qemu-13-instance-0000000d.
Dec  5 07:55:09 np0005546954 NetworkManager[55665]: <info>  [1764939309.9948] device (tap08898a8d-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:55:10 np0005546954 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Dec  5 07:55:10 np0005546954 nova_compute[187160]: 2025-12-05 12:55:10.413 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:11 np0005546954 nova_compute[187160]: 2025-12-05 12:55:11.031 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939311.0302238, 6dc1f44b-7b59-497d-b8b4-46919828df13 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:55:11 np0005546954 nova_compute[187160]: 2025-12-05 12:55:11.032 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] VM Started (Lifecycle Event)#033[00m
Dec  5 07:55:11 np0005546954 nova_compute[187160]: 2025-12-05 12:55:11.058 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:55:12 np0005546954 nova_compute[187160]: 2025-12-05 12:55:12.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:55:12 np0005546954 nova_compute[187160]: 2025-12-05 12:55:12.042 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939312.0419872, 6dc1f44b-7b59-497d-b8b4-46919828df13 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:55:12 np0005546954 nova_compute[187160]: 2025-12-05 12:55:12.043 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:55:12 np0005546954 nova_compute[187160]: 2025-12-05 12:55:12.073 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:55:12 np0005546954 nova_compute[187160]: 2025-12-05 12:55:12.079 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:55:12 np0005546954 nova_compute[187160]: 2025-12-05 12:55:12.112 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  5 07:55:13 np0005546954 podman[213394]: 2025-12-05 12:55:13.592236284 +0000 UTC m=+0.099671314 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:55:13 np0005546954 podman[213393]: 2025-12-05 12:55:13.595523058 +0000 UTC m=+0.106157298 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec  5 07:55:13 np0005546954 nova_compute[187160]: 2025-12-05 12:55:13.839 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:13 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:13Z|00142|binding|INFO|Claiming lport 08898a8d-58ea-4d2a-8141-17381d0867b2 for this chassis.
Dec  5 07:55:13 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:13Z|00143|binding|INFO|08898a8d-58ea-4d2a-8141-17381d0867b2: Claiming fa:16:3e:a0:ad:31 10.100.0.3
Dec  5 07:55:13 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:13Z|00144|binding|INFO|Setting lport 08898a8d-58ea-4d2a-8141-17381d0867b2 up in Southbound
Dec  5 07:55:13 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:13.925 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:ad:31 10.100.0.3'], port_security=['fa:16:3e:a0:ad:31 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6dc1f44b-7b59-497d-b8b4-46919828df13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '11', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=08898a8d-58ea-4d2a-8141-17381d0867b2) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:55:13 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:13.927 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 08898a8d-58ea-4d2a-8141-17381d0867b2 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis#033[00m
Dec  5 07:55:13 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:13.929 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:55:13 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:13.952 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[381ad8c2-04f8-40a1-bb53-88d885970f75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:13 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:13.990 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[8413cbbf-a714-4c40-802f-b26114a2e7f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:13 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:13.994 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[ab0837ad-d891-4f9b-b685-dd4bcb13d933]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:14 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:14.025 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[4296bef5-3983-4a06-8b72-5870bfebd22b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:14 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:14.042 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[0985a6f8-2b06-47db-a190-a4412f87a783]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428232, 'reachable_time': 35465, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213439, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:14 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:14.059 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b173cdf3-6385-47e8-b90f-de47c110523f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428246, 'tstamp': 428246}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213440, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428249, 'tstamp': 428249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213440, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:14 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:14.061 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:14 np0005546954 nova_compute[187160]: 2025-12-05 12:55:14.064 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:14 np0005546954 nova_compute[187160]: 2025-12-05 12:55:14.065 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:14 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:14.066 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:14 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:14.067 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:55:14 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:14.067 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:14 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:14.068 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:55:14 np0005546954 nova_compute[187160]: 2025-12-05 12:55:14.766 187164 INFO nova.compute.manager [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Post operation of migration started#033[00m
Dec  5 07:55:15 np0005546954 nova_compute[187160]: 2025-12-05 12:55:15.416 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:15 np0005546954 nova_compute[187160]: 2025-12-05 12:55:15.643 187164 DEBUG oslo_concurrency.lockutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-6dc1f44b-7b59-497d-b8b4-46919828df13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:55:15 np0005546954 nova_compute[187160]: 2025-12-05 12:55:15.644 187164 DEBUG oslo_concurrency.lockutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-6dc1f44b-7b59-497d-b8b4-46919828df13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:55:15 np0005546954 nova_compute[187160]: 2025-12-05 12:55:15.644 187164 DEBUG nova.network.neutron [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:55:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:16.954 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:16.956 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:16.957 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:17 np0005546954 nova_compute[187160]: 2025-12-05 12:55:17.639 187164 DEBUG nova.network.neutron [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Updating instance_info_cache with network_info: [{"id": "08898a8d-58ea-4d2a-8141-17381d0867b2", "address": "fa:16:3e:a0:ad:31", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08898a8d-58", "ovs_interfaceid": "08898a8d-58ea-4d2a-8141-17381d0867b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:55:17 np0005546954 nova_compute[187160]: 2025-12-05 12:55:17.663 187164 DEBUG oslo_concurrency.lockutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-6dc1f44b-7b59-497d-b8b4-46919828df13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:55:17 np0005546954 nova_compute[187160]: 2025-12-05 12:55:17.683 187164 DEBUG oslo_concurrency.lockutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:17 np0005546954 nova_compute[187160]: 2025-12-05 12:55:17.684 187164 DEBUG oslo_concurrency.lockutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:17 np0005546954 nova_compute[187160]: 2025-12-05 12:55:17.684 187164 DEBUG oslo_concurrency.lockutils [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:17 np0005546954 nova_compute[187160]: 2025-12-05 12:55:17.692 187164 INFO nova.virt.libvirt.driver [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  5 07:55:17 np0005546954 virtqemud[186730]: Domain id=13 name='instance-0000000d' uuid=6dc1f44b-7b59-497d-b8b4-46919828df13 is tainted: custom-monitor
Dec  5 07:55:18 np0005546954 nova_compute[187160]: 2025-12-05 12:55:18.704 187164 INFO nova.virt.libvirt.driver [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  5 07:55:18 np0005546954 nova_compute[187160]: 2025-12-05 12:55:18.842 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:55:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:55:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:55:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:55:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:55:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:55:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:55:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:55:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:55:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:55:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:55:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:55:19 np0005546954 nova_compute[187160]: 2025-12-05 12:55:19.711 187164 INFO nova.virt.libvirt.driver [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  5 07:55:19 np0005546954 nova_compute[187160]: 2025-12-05 12:55:19.716 187164 DEBUG nova.compute.manager [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:55:19 np0005546954 nova_compute[187160]: 2025-12-05 12:55:19.737 187164 DEBUG nova.objects.instance [None req-4eb43517-3c1c-4510-92d5-2ba690333bcd 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:55:20 np0005546954 nova_compute[187160]: 2025-12-05 12:55:20.454 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:23 np0005546954 nova_compute[187160]: 2025-12-05 12:55:23.845 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:24 np0005546954 nova_compute[187160]: 2025-12-05 12:55:24.928 187164 DEBUG oslo_concurrency.lockutils [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:24 np0005546954 nova_compute[187160]: 2025-12-05 12:55:24.929 187164 DEBUG oslo_concurrency.lockutils [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:24 np0005546954 nova_compute[187160]: 2025-12-05 12:55:24.929 187164 DEBUG oslo_concurrency.lockutils [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:24 np0005546954 nova_compute[187160]: 2025-12-05 12:55:24.929 187164 DEBUG oslo_concurrency.lockutils [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:24 np0005546954 nova_compute[187160]: 2025-12-05 12:55:24.930 187164 DEBUG oslo_concurrency.lockutils [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:24 np0005546954 nova_compute[187160]: 2025-12-05 12:55:24.931 187164 INFO nova.compute.manager [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Terminating instance#033[00m
Dec  5 07:55:24 np0005546954 nova_compute[187160]: 2025-12-05 12:55:24.932 187164 DEBUG nova.compute.manager [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:55:24 np0005546954 kernel: tape78f1ad2-5a (unregistering): left promiscuous mode
Dec  5 07:55:24 np0005546954 NetworkManager[55665]: <info>  [1764939324.9590] device (tape78f1ad2-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:55:24 np0005546954 nova_compute[187160]: 2025-12-05 12:55:24.967 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:24 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:24Z|00145|binding|INFO|Releasing lport e78f1ad2-5aff-40db-94db-beab9e8252a9 from this chassis (sb_readonly=0)
Dec  5 07:55:24 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:24Z|00146|binding|INFO|Setting lport e78f1ad2-5aff-40db-94db-beab9e8252a9 down in Southbound
Dec  5 07:55:24 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:24Z|00147|binding|INFO|Removing iface tape78f1ad2-5a ovn-installed in OVS
Dec  5 07:55:24 np0005546954 nova_compute[187160]: 2025-12-05 12:55:24.971 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:24.983 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:9f:53 10.100.0.4'], port_security=['fa:16:3e:6e:9f:53 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=e78f1ad2-5aff-40db-94db-beab9e8252a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:55:24 np0005546954 nova_compute[187160]: 2025-12-05 12:55:24.984 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:24.986 104428 INFO neutron.agent.ovn.metadata.agent [-] Port e78f1ad2-5aff-40db-94db-beab9e8252a9 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 07:55:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:24.988 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:55:25 np0005546954 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Dec  5 07:55:25 np0005546954 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000e.scope: Consumed 14.217s CPU time.
Dec  5 07:55:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:25.008 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[98de4fdf-96e2-4169-95ce-c1eb47e8e0ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:25 np0005546954 systemd-machined[153497]: Machine qemu-12-instance-0000000e terminated.
Dec  5 07:55:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:25.034 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[5f8d7415-2700-49c8-94ef-12f9574031a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:25.039 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[492342ee-e7a5-498c-ad12-4b6b2a70e0d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:25.060 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[d850797d-7f9e-42c7-988e-c7c749ed3419]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:25.076 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[cb7371ea-2899-4ba2-be0e-af9da7191279]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428232, 'reachable_time': 35465, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213453, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:25.092 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4555ba16-25cb-44fb-89f8-084f54ab6fdb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428246, 'tstamp': 428246}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213454, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428249, 'tstamp': 428249}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213454, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:25.094 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.095 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.100 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:25.101 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:25.101 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:55:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:25.101 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:25.102 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.206 187164 INFO nova.virt.libvirt.driver [-] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Instance destroyed successfully.#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.208 187164 DEBUG nova.objects.instance [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'resources' on Instance uuid 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.222 187164 DEBUG nova.virt.libvirt.vif [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:54:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1881175617',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1881175617',id=14,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:54:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-y8ivd8ap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_
name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:54:36Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "address": "fa:16:3e:6e:9f:53", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape78f1ad2-5a", "ovs_interfaceid": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.222 187164 DEBUG nova.network.os_vif_util [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "address": "fa:16:3e:6e:9f:53", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape78f1ad2-5a", "ovs_interfaceid": "e78f1ad2-5aff-40db-94db-beab9e8252a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.223 187164 DEBUG nova.network.os_vif_util [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6e:9f:53,bridge_name='br-int',has_traffic_filtering=True,id=e78f1ad2-5aff-40db-94db-beab9e8252a9,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape78f1ad2-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.223 187164 DEBUG os_vif [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:9f:53,bridge_name='br-int',has_traffic_filtering=True,id=e78f1ad2-5aff-40db-94db-beab9e8252a9,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape78f1ad2-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.227 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.227 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape78f1ad2-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.229 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.232 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.235 187164 INFO os_vif [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:9f:53,bridge_name='br-int',has_traffic_filtering=True,id=e78f1ad2-5aff-40db-94db-beab9e8252a9,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape78f1ad2-5a')#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.236 187164 INFO nova.virt.libvirt.driver [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Deleting instance files /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1_del#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.237 187164 INFO nova.virt.libvirt.driver [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Deletion of /var/lib/nova/instances/245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1_del complete#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.284 187164 INFO nova.compute.manager [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.285 187164 DEBUG oslo.service.loopingcall [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.285 187164 DEBUG nova.compute.manager [-] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.285 187164 DEBUG nova.network.neutron [-] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.457 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.877 187164 DEBUG nova.compute.manager [req-53a7e220-b3b9-456b-91d5-04036f98fa10 req-1a8db843-9aaf-4e16-a8a0-3092c37c9db3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Received event network-vif-unplugged-e78f1ad2-5aff-40db-94db-beab9e8252a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.878 187164 DEBUG oslo_concurrency.lockutils [req-53a7e220-b3b9-456b-91d5-04036f98fa10 req-1a8db843-9aaf-4e16-a8a0-3092c37c9db3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.878 187164 DEBUG oslo_concurrency.lockutils [req-53a7e220-b3b9-456b-91d5-04036f98fa10 req-1a8db843-9aaf-4e16-a8a0-3092c37c9db3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.878 187164 DEBUG oslo_concurrency.lockutils [req-53a7e220-b3b9-456b-91d5-04036f98fa10 req-1a8db843-9aaf-4e16-a8a0-3092c37c9db3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.878 187164 DEBUG nova.compute.manager [req-53a7e220-b3b9-456b-91d5-04036f98fa10 req-1a8db843-9aaf-4e16-a8a0-3092c37c9db3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] No waiting events found dispatching network-vif-unplugged-e78f1ad2-5aff-40db-94db-beab9e8252a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:55:25 np0005546954 nova_compute[187160]: 2025-12-05 12:55:25.878 187164 DEBUG nova.compute.manager [req-53a7e220-b3b9-456b-91d5-04036f98fa10 req-1a8db843-9aaf-4e16-a8a0-3092c37c9db3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Received event network-vif-unplugged-e78f1ad2-5aff-40db-94db-beab9e8252a9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:55:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:26.118 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:55:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:26.119 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:55:26 np0005546954 nova_compute[187160]: 2025-12-05 12:55:26.120 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:26 np0005546954 nova_compute[187160]: 2025-12-05 12:55:26.160 187164 DEBUG nova.network.neutron [-] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:55:26 np0005546954 nova_compute[187160]: 2025-12-05 12:55:26.176 187164 INFO nova.compute.manager [-] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Took 0.89 seconds to deallocate network for instance.#033[00m
Dec  5 07:55:26 np0005546954 nova_compute[187160]: 2025-12-05 12:55:26.223 187164 DEBUG oslo_concurrency.lockutils [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:26 np0005546954 nova_compute[187160]: 2025-12-05 12:55:26.223 187164 DEBUG oslo_concurrency.lockutils [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:26 np0005546954 nova_compute[187160]: 2025-12-05 12:55:26.308 187164 DEBUG nova.compute.provider_tree [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:55:26 np0005546954 nova_compute[187160]: 2025-12-05 12:55:26.344 187164 DEBUG nova.scheduler.client.report [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:55:26 np0005546954 nova_compute[187160]: 2025-12-05 12:55:26.369 187164 DEBUG oslo_concurrency.lockutils [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:26 np0005546954 nova_compute[187160]: 2025-12-05 12:55:26.404 187164 INFO nova.scheduler.client.report [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Deleted allocations for instance 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1#033[00m
Dec  5 07:55:26 np0005546954 nova_compute[187160]: 2025-12-05 12:55:26.467 187164 DEBUG oslo_concurrency.lockutils [None req-3ba3e899-a178-48ad-8d7e-2327f46dd4fa 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:26 np0005546954 podman[213473]: 2025-12-05 12:55:26.575019192 +0000 UTC m=+0.079668987 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.249 187164 DEBUG oslo_concurrency.lockutils [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "6dc1f44b-7b59-497d-b8b4-46919828df13" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.249 187164 DEBUG oslo_concurrency.lockutils [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "6dc1f44b-7b59-497d-b8b4-46919828df13" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.250 187164 DEBUG oslo_concurrency.lockutils [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "6dc1f44b-7b59-497d-b8b4-46919828df13-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.250 187164 DEBUG oslo_concurrency.lockutils [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "6dc1f44b-7b59-497d-b8b4-46919828df13-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.250 187164 DEBUG oslo_concurrency.lockutils [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "6dc1f44b-7b59-497d-b8b4-46919828df13-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.251 187164 INFO nova.compute.manager [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Terminating instance#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.251 187164 DEBUG nova.compute.manager [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:55:27 np0005546954 kernel: tap08898a8d-58 (unregistering): left promiscuous mode
Dec  5 07:55:27 np0005546954 NetworkManager[55665]: <info>  [1764939327.2720] device (tap08898a8d-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:55:27 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:27Z|00148|binding|INFO|Releasing lport 08898a8d-58ea-4d2a-8141-17381d0867b2 from this chassis (sb_readonly=0)
Dec  5 07:55:27 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:27Z|00149|binding|INFO|Setting lport 08898a8d-58ea-4d2a-8141-17381d0867b2 down in Southbound
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.282 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:27 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:27Z|00150|binding|INFO|Removing iface tap08898a8d-58 ovn-installed in OVS
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.283 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.300 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:ad:31 10.100.0.3'], port_security=['fa:16:3e:a0:ad:31 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6dc1f44b-7b59-497d-b8b4-46919828df13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '11', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=08898a8d-58ea-4d2a-8141-17381d0867b2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.301 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 08898a8d-58ea-4d2a-8141-17381d0867b2 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.303 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d4389bc8-2898-48b0-9741-5183b54fe83c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.305 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.305 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4c2bce05-4397-44d4-b124-9ea8630ab0ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.306 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace which is not needed anymore#033[00m
Dec  5 07:55:27 np0005546954 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Dec  5 07:55:27 np0005546954 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 2.229s CPU time.
Dec  5 07:55:27 np0005546954 systemd-machined[153497]: Machine qemu-13-instance-0000000d terminated.
Dec  5 07:55:27 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213181]: [NOTICE]   (213185) : haproxy version is 2.8.14-c23fe91
Dec  5 07:55:27 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213181]: [NOTICE]   (213185) : path to executable is /usr/sbin/haproxy
Dec  5 07:55:27 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213181]: [WARNING]  (213185) : Exiting Master process...
Dec  5 07:55:27 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213181]: [ALERT]    (213185) : Current worker (213187) exited with code 143 (Terminated)
Dec  5 07:55:27 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213181]: [WARNING]  (213185) : All workers exited. Exiting... (0)
Dec  5 07:55:27 np0005546954 systemd[1]: libpod-0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a.scope: Deactivated successfully.
Dec  5 07:55:27 np0005546954 podman[213515]: 2025-12-05 12:55:27.467472218 +0000 UTC m=+0.050603347 container died 0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  5 07:55:27 np0005546954 NetworkManager[55665]: <info>  [1764939327.4713] manager: (tap08898a8d-58): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Dec  5 07:55:27 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a-userdata-shm.mount: Deactivated successfully.
Dec  5 07:55:27 np0005546954 systemd[1]: var-lib-containers-storage-overlay-1dbddd37e847695c55a94471c9e1911e3cbeec7fe55e83877bf887f46cd71b8c-merged.mount: Deactivated successfully.
Dec  5 07:55:27 np0005546954 podman[213515]: 2025-12-05 12:55:27.508968748 +0000 UTC m=+0.092099877 container cleanup 0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:55:27 np0005546954 systemd[1]: libpod-conmon-0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a.scope: Deactivated successfully.
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.526 187164 INFO nova.virt.libvirt.driver [-] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Instance destroyed successfully.#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.526 187164 DEBUG nova.objects.instance [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'resources' on Instance uuid 6dc1f44b-7b59-497d-b8b4-46919828df13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.543 187164 DEBUG nova.virt.libvirt.vif [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:54:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-63740190',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-63740190',id=13,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:54:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-9lkcn4ga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:55:19Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=6dc1f44b-7b59-497d-b8b4-46919828df13,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08898a8d-58ea-4d2a-8141-17381d0867b2", "address": "fa:16:3e:a0:ad:31", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08898a8d-58", "ovs_interfaceid": "08898a8d-58ea-4d2a-8141-17381d0867b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.544 187164 DEBUG nova.network.os_vif_util [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "08898a8d-58ea-4d2a-8141-17381d0867b2", "address": "fa:16:3e:a0:ad:31", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08898a8d-58", "ovs_interfaceid": "08898a8d-58ea-4d2a-8141-17381d0867b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.545 187164 DEBUG nova.network.os_vif_util [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:ad:31,bridge_name='br-int',has_traffic_filtering=True,id=08898a8d-58ea-4d2a-8141-17381d0867b2,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08898a8d-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.546 187164 DEBUG os_vif [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:ad:31,bridge_name='br-int',has_traffic_filtering=True,id=08898a8d-58ea-4d2a-8141-17381d0867b2,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08898a8d-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.547 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.547 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08898a8d-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.549 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.550 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.553 187164 INFO os_vif [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:ad:31,bridge_name='br-int',has_traffic_filtering=True,id=08898a8d-58ea-4d2a-8141-17381d0867b2,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08898a8d-58')#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.554 187164 INFO nova.virt.libvirt.driver [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Deleting instance files /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13_del#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.554 187164 INFO nova.virt.libvirt.driver [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Deletion of /var/lib/nova/instances/6dc1f44b-7b59-497d-b8b4-46919828df13_del complete#033[00m
Dec  5 07:55:27 np0005546954 podman[213563]: 2025-12-05 12:55:27.575397739 +0000 UTC m=+0.041989907 container remove 0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.581 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[3026a93f-798e-4770-a475-b450bd91b10c]: (4, ('Fri Dec  5 12:55:27 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a)\n0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a\nFri Dec  5 12:55:27 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a)\n0a58a8d3765cc73cd77ec79e561200279323cd72fd33e295677fa7a38166f26a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.582 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e04f26da-7d1f-43c2-b396-a48e42e719ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.583 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.584 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:27 np0005546954 kernel: tapd4389bc8-20: left promiscuous mode
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.600 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.603 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[24c4dcd8-1795-4858-8e4b-cb830035a483]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.616 187164 INFO nova.compute.manager [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.617 187164 DEBUG oslo.service.loopingcall [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.618 187164 DEBUG nova.compute.manager [-] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.619 187164 DEBUG nova.network.neutron [-] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.618 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7dcc644a-e752-4445-a75d-a62e30673abf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.620 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[6d317c8a-72fc-4c23-84d3-65fbf191c1ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.635 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[3e64749d-8090-44a4-939f-53a4bfa3eed5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428224, 'reachable_time': 42268, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213577, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:27 np0005546954 systemd[1]: run-netns-ovnmeta\x2dd4389bc8\x2d2898\x2d48b0\x2d9741\x2d5183b54fe83c.mount: Deactivated successfully.
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.644 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:55:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:27.645 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[756b4997-f651-442f-873b-3830e0e21dd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.992 187164 DEBUG nova.compute.manager [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Received event network-vif-plugged-e78f1ad2-5aff-40db-94db-beab9e8252a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.992 187164 DEBUG oslo_concurrency.lockutils [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.992 187164 DEBUG oslo_concurrency.lockutils [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.993 187164 DEBUG oslo_concurrency.lockutils [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.993 187164 DEBUG nova.compute.manager [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] No waiting events found dispatching network-vif-plugged-e78f1ad2-5aff-40db-94db-beab9e8252a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.993 187164 WARNING nova.compute.manager [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Received unexpected event network-vif-plugged-e78f1ad2-5aff-40db-94db-beab9e8252a9 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.993 187164 DEBUG nova.compute.manager [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Received event network-vif-deleted-e78f1ad2-5aff-40db-94db-beab9e8252a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.994 187164 DEBUG nova.compute.manager [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Received event network-vif-unplugged-08898a8d-58ea-4d2a-8141-17381d0867b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.994 187164 DEBUG oslo_concurrency.lockutils [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "6dc1f44b-7b59-497d-b8b4-46919828df13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.994 187164 DEBUG oslo_concurrency.lockutils [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "6dc1f44b-7b59-497d-b8b4-46919828df13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.994 187164 DEBUG oslo_concurrency.lockutils [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "6dc1f44b-7b59-497d-b8b4-46919828df13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.994 187164 DEBUG nova.compute.manager [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] No waiting events found dispatching network-vif-unplugged-08898a8d-58ea-4d2a-8141-17381d0867b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:55:27 np0005546954 nova_compute[187160]: 2025-12-05 12:55:27.995 187164 DEBUG nova.compute.manager [req-d27e211d-98e5-4561-8bdc-be7feee7129b req-1c94832f-9177-485e-80b7-a3efe0e83f7a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Received event network-vif-unplugged-08898a8d-58ea-4d2a-8141-17381d0867b2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:55:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:55:28.122 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:55:28 np0005546954 nova_compute[187160]: 2025-12-05 12:55:28.661 187164 DEBUG nova.network.neutron [-] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:55:28 np0005546954 nova_compute[187160]: 2025-12-05 12:55:28.681 187164 INFO nova.compute.manager [-] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Took 1.06 seconds to deallocate network for instance.#033[00m
Dec  5 07:55:28 np0005546954 nova_compute[187160]: 2025-12-05 12:55:28.736 187164 DEBUG oslo_concurrency.lockutils [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:28 np0005546954 nova_compute[187160]: 2025-12-05 12:55:28.737 187164 DEBUG oslo_concurrency.lockutils [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:28 np0005546954 nova_compute[187160]: 2025-12-05 12:55:28.741 187164 DEBUG oslo_concurrency.lockutils [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:28 np0005546954 nova_compute[187160]: 2025-12-05 12:55:28.765 187164 INFO nova.scheduler.client.report [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Deleted allocations for instance 6dc1f44b-7b59-497d-b8b4-46919828df13#033[00m
Dec  5 07:55:28 np0005546954 nova_compute[187160]: 2025-12-05 12:55:28.820 187164 DEBUG oslo_concurrency.lockutils [None req-73882457-4df5-4064-a00c-a872bd74313a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "6dc1f44b-7b59-497d-b8b4-46919828df13" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.571s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:30 np0005546954 nova_compute[187160]: 2025-12-05 12:55:30.085 187164 DEBUG nova.compute.manager [req-877ade41-b6cd-407a-af6d-f3bf8422a030 req-55a49ee4-a928-482d-94c0-3a0b046b8d9d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Received event network-vif-plugged-08898a8d-58ea-4d2a-8141-17381d0867b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:55:30 np0005546954 nova_compute[187160]: 2025-12-05 12:55:30.085 187164 DEBUG oslo_concurrency.lockutils [req-877ade41-b6cd-407a-af6d-f3bf8422a030 req-55a49ee4-a928-482d-94c0-3a0b046b8d9d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "6dc1f44b-7b59-497d-b8b4-46919828df13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:55:30 np0005546954 nova_compute[187160]: 2025-12-05 12:55:30.086 187164 DEBUG oslo_concurrency.lockutils [req-877ade41-b6cd-407a-af6d-f3bf8422a030 req-55a49ee4-a928-482d-94c0-3a0b046b8d9d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "6dc1f44b-7b59-497d-b8b4-46919828df13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:55:30 np0005546954 nova_compute[187160]: 2025-12-05 12:55:30.086 187164 DEBUG oslo_concurrency.lockutils [req-877ade41-b6cd-407a-af6d-f3bf8422a030 req-55a49ee4-a928-482d-94c0-3a0b046b8d9d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "6dc1f44b-7b59-497d-b8b4-46919828df13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:55:30 np0005546954 nova_compute[187160]: 2025-12-05 12:55:30.086 187164 DEBUG nova.compute.manager [req-877ade41-b6cd-407a-af6d-f3bf8422a030 req-55a49ee4-a928-482d-94c0-3a0b046b8d9d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] No waiting events found dispatching network-vif-plugged-08898a8d-58ea-4d2a-8141-17381d0867b2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:55:30 np0005546954 nova_compute[187160]: 2025-12-05 12:55:30.087 187164 WARNING nova.compute.manager [req-877ade41-b6cd-407a-af6d-f3bf8422a030 req-55a49ee4-a928-482d-94c0-3a0b046b8d9d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Received unexpected event network-vif-plugged-08898a8d-58ea-4d2a-8141-17381d0867b2 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:55:30 np0005546954 nova_compute[187160]: 2025-12-05 12:55:30.087 187164 DEBUG nova.compute.manager [req-877ade41-b6cd-407a-af6d-f3bf8422a030 req-55a49ee4-a928-482d-94c0-3a0b046b8d9d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Received event network-vif-deleted-08898a8d-58ea-4d2a-8141-17381d0867b2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:55:30 np0005546954 nova_compute[187160]: 2025-12-05 12:55:30.459 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:31 np0005546954 podman[213581]: 2025-12-05 12:55:31.562289499 +0000 UTC m=+0.062861882 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:55:31 np0005546954 podman[213580]: 2025-12-05 12:55:31.60573103 +0000 UTC m=+0.106286762 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  5 07:55:32 np0005546954 nova_compute[187160]: 2025-12-05 12:55:32.599 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:35 np0005546954 nova_compute[187160]: 2025-12-05 12:55:35.461 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:35 np0005546954 podman[197513]: time="2025-12-05T12:55:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:55:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:55:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:55:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:55:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Dec  5 07:55:37 np0005546954 nova_compute[187160]: 2025-12-05 12:55:37.645 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:40 np0005546954 nova_compute[187160]: 2025-12-05 12:55:40.205 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939325.2036648, 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:55:40 np0005546954 nova_compute[187160]: 2025-12-05 12:55:40.206 187164 INFO nova.compute.manager [-] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:55:40 np0005546954 nova_compute[187160]: 2025-12-05 12:55:40.239 187164 DEBUG nova.compute.manager [None req-07ebdf97-fe61-4faf-abd8-2e964fd622ad - - - - - -] [instance: 245e0b98-27dd-4ad1-b5ea-7b4bf98c66a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:55:40 np0005546954 nova_compute[187160]: 2025-12-05 12:55:40.488 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:42 np0005546954 nova_compute[187160]: 2025-12-05 12:55:42.525 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939327.5235572, 6dc1f44b-7b59-497d-b8b4-46919828df13 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:55:42 np0005546954 nova_compute[187160]: 2025-12-05 12:55:42.526 187164 INFO nova.compute.manager [-] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:55:42 np0005546954 nova_compute[187160]: 2025-12-05 12:55:42.549 187164 DEBUG nova.compute.manager [None req-04e14602-152e-4a64-8afe-7b6389ca9473 - - - - - -] [instance: 6dc1f44b-7b59-497d-b8b4-46919828df13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:55:42 np0005546954 nova_compute[187160]: 2025-12-05 12:55:42.647 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:44 np0005546954 podman[213631]: 2025-12-05 12:55:44.590203562 +0000 UTC m=+0.089048753 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal)
Dec  5 07:55:44 np0005546954 podman[213632]: 2025-12-05 12:55:44.605077358 +0000 UTC m=+0.097860228 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:55:45 np0005546954 nova_compute[187160]: 2025-12-05 12:55:45.490 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:47 np0005546954 nova_compute[187160]: 2025-12-05 12:55:47.651 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:55:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:55:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:55:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:55:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:55:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:55:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:55:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:55:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:55:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:55:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:55:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:55:50 np0005546954 nova_compute[187160]: 2025-12-05 12:55:50.508 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:52 np0005546954 nova_compute[187160]: 2025-12-05 12:55:52.653 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:55 np0005546954 nova_compute[187160]: 2025-12-05 12:55:55.551 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:57 np0005546954 podman[213675]: 2025-12-05 12:55:57.595707272 +0000 UTC m=+0.096973269 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 07:55:57 np0005546954 nova_compute[187160]: 2025-12-05 12:55:57.657 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:55:58 np0005546954 ovn_controller[95566]: 2025-12-05T12:55:58Z|00151|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Dec  5 07:56:00 np0005546954 nova_compute[187160]: 2025-12-05 12:56:00.041 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:56:00 np0005546954 nova_compute[187160]: 2025-12-05 12:56:00.041 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:56:00 np0005546954 nova_compute[187160]: 2025-12-05 12:56:00.042 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:56:00 np0005546954 nova_compute[187160]: 2025-12-05 12:56:00.058 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:56:00 np0005546954 nova_compute[187160]: 2025-12-05 12:56:00.058 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:56:00 np0005546954 nova_compute[187160]: 2025-12-05 12:56:00.553 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:01 np0005546954 nova_compute[187160]: 2025-12-05 12:56:01.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:56:02 np0005546954 podman[213696]: 2025-12-05 12:56:02.606425034 +0000 UTC m=+0.103400961 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:56:02 np0005546954 podman[213695]: 2025-12-05 12:56:02.631616863 +0000 UTC m=+0.134883898 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:56:02 np0005546954 nova_compute[187160]: 2025-12-05 12:56:02.659 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:05 np0005546954 nova_compute[187160]: 2025-12-05 12:56:05.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:56:05 np0005546954 nova_compute[187160]: 2025-12-05 12:56:05.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:56:05 np0005546954 nova_compute[187160]: 2025-12-05 12:56:05.598 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:05 np0005546954 podman[197513]: time="2025-12-05T12:56:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:56:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:56:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:56:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:56:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Dec  5 07:56:07 np0005546954 nova_compute[187160]: 2025-12-05 12:56:07.663 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.062 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.063 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.063 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.063 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.212 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.212 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5877MB free_disk=73.33618927001953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.213 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.213 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.284 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.284 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.309 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.365 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.390 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:56:08 np0005546954 nova_compute[187160]: 2025-12-05 12:56:08.390 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:56:10 np0005546954 nova_compute[187160]: 2025-12-05 12:56:10.598 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:11 np0005546954 nova_compute[187160]: 2025-12-05 12:56:11.599 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:56:11 np0005546954 nova_compute[187160]: 2025-12-05 12:56:11.600 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:56:12 np0005546954 nova_compute[187160]: 2025-12-05 12:56:12.666 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:13 np0005546954 nova_compute[187160]: 2025-12-05 12:56:13.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:56:15 np0005546954 podman[213749]: 2025-12-05 12:56:15.587227409 +0000 UTC m=+0.088689210 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Dec  5 07:56:15 np0005546954 podman[213750]: 2025-12-05 12:56:15.589996176 +0000 UTC m=+0.089300100 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:56:15 np0005546954 nova_compute[187160]: 2025-12-05 12:56:15.599 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:16.955 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:56:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:16.957 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:56:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:16.957 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:56:17 np0005546954 nova_compute[187160]: 2025-12-05 12:56:17.669 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:56:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:56:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:56:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:56:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:56:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:56:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:56:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:56:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:56:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.024 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "4e3508c1-ecaa-442a-95c5-f5095e12912e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.025 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.044 187164 DEBUG nova.compute.manager [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.162 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.162 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.169 187164 DEBUG nova.virt.hardware [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.169 187164 INFO nova.compute.claims [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.303 187164 DEBUG nova.compute.provider_tree [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.323 187164 DEBUG nova.scheduler.client.report [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.350 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.351 187164 DEBUG nova.compute.manager [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.423 187164 DEBUG nova.compute.manager [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.424 187164 DEBUG nova.network.neutron [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.444 187164 INFO nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.469 187164 DEBUG nova.compute.manager [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.580 187164 DEBUG nova.compute.manager [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.582 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.583 187164 INFO nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Creating image(s)#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.584 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "/var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.585 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.586 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.612 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.616 187164 DEBUG oslo_concurrency.processutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.644 187164 DEBUG nova.policy [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ae0bb20ac8b4be99eb1abddc7310436', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.696 187164 DEBUG oslo_concurrency.processutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.698 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.699 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.723 187164 DEBUG oslo_concurrency.processutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.783 187164 DEBUG oslo_concurrency.processutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.784 187164 DEBUG oslo_concurrency.processutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.833 187164 DEBUG oslo_concurrency.processutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.834 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.835 187164 DEBUG oslo_concurrency.processutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.891 187164 DEBUG oslo_concurrency.processutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.893 187164 DEBUG nova.virt.disk.api [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Checking if we can resize image /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.894 187164 DEBUG oslo_concurrency.processutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.974 187164 DEBUG oslo_concurrency.processutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.975 187164 DEBUG nova.virt.disk.api [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Cannot resize image /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.976 187164 DEBUG nova.objects.instance [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e3508c1-ecaa-442a-95c5-f5095e12912e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.989 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.990 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Ensure instance console log exists: /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.990 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.991 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:56:20 np0005546954 nova_compute[187160]: 2025-12-05 12:56:20.992 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:56:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:21.245 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:56:21 np0005546954 nova_compute[187160]: 2025-12-05 12:56:21.245 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:21.247 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:56:21 np0005546954 nova_compute[187160]: 2025-12-05 12:56:21.349 187164 DEBUG nova.network.neutron [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Successfully created port: 74c7c063-9c46-4bde-8e16-cab6f4e2e23c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:56:22 np0005546954 nova_compute[187160]: 2025-12-05 12:56:22.672 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:23 np0005546954 nova_compute[187160]: 2025-12-05 12:56:23.786 187164 DEBUG nova.network.neutron [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Successfully updated port: 74c7c063-9c46-4bde-8e16-cab6f4e2e23c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:56:23 np0005546954 nova_compute[187160]: 2025-12-05 12:56:23.804 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "refresh_cache-4e3508c1-ecaa-442a-95c5-f5095e12912e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:56:23 np0005546954 nova_compute[187160]: 2025-12-05 12:56:23.804 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquired lock "refresh_cache-4e3508c1-ecaa-442a-95c5-f5095e12912e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:56:23 np0005546954 nova_compute[187160]: 2025-12-05 12:56:23.805 187164 DEBUG nova.network.neutron [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:56:23 np0005546954 nova_compute[187160]: 2025-12-05 12:56:23.973 187164 DEBUG nova.compute.manager [req-2efdfa9e-9045-4e64-8409-5c63553c6477 req-218daac4-12db-4525-9116-017ad809327d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Received event network-changed-74c7c063-9c46-4bde-8e16-cab6f4e2e23c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:56:23 np0005546954 nova_compute[187160]: 2025-12-05 12:56:23.973 187164 DEBUG nova.compute.manager [req-2efdfa9e-9045-4e64-8409-5c63553c6477 req-218daac4-12db-4525-9116-017ad809327d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Refreshing instance network info cache due to event network-changed-74c7c063-9c46-4bde-8e16-cab6f4e2e23c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:56:23 np0005546954 nova_compute[187160]: 2025-12-05 12:56:23.973 187164 DEBUG oslo_concurrency.lockutils [req-2efdfa9e-9045-4e64-8409-5c63553c6477 req-218daac4-12db-4525-9116-017ad809327d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-4e3508c1-ecaa-442a-95c5-f5095e12912e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:56:24 np0005546954 nova_compute[187160]: 2025-12-05 12:56:24.653 187164 DEBUG nova.network.neutron [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:56:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:25.249 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:56:25 np0005546954 nova_compute[187160]: 2025-12-05 12:56:25.603 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:25 np0005546954 nova_compute[187160]: 2025-12-05 12:56:25.976 187164 DEBUG nova.network.neutron [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Updating instance_info_cache with network_info: [{"id": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "address": "fa:16:3e:83:2c:4d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74c7c063-9c", "ovs_interfaceid": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.015 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Releasing lock "refresh_cache-4e3508c1-ecaa-442a-95c5-f5095e12912e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.015 187164 DEBUG nova.compute.manager [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Instance network_info: |[{"id": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "address": "fa:16:3e:83:2c:4d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74c7c063-9c", "ovs_interfaceid": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.016 187164 DEBUG oslo_concurrency.lockutils [req-2efdfa9e-9045-4e64-8409-5c63553c6477 req-218daac4-12db-4525-9116-017ad809327d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-4e3508c1-ecaa-442a-95c5-f5095e12912e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.016 187164 DEBUG nova.network.neutron [req-2efdfa9e-9045-4e64-8409-5c63553c6477 req-218daac4-12db-4525-9116-017ad809327d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Refreshing network info cache for port 74c7c063-9c46-4bde-8e16-cab6f4e2e23c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.018 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Start _get_guest_xml network_info=[{"id": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "address": "fa:16:3e:83:2c:4d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74c7c063-9c", "ovs_interfaceid": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.023 187164 WARNING nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.028 187164 DEBUG nova.virt.libvirt.host [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.028 187164 DEBUG nova.virt.libvirt.host [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.033 187164 DEBUG nova.virt.libvirt.host [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.034 187164 DEBUG nova.virt.libvirt.host [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.035 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.035 187164 DEBUG nova.virt.hardware [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.035 187164 DEBUG nova.virt.hardware [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.035 187164 DEBUG nova.virt.hardware [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.036 187164 DEBUG nova.virt.hardware [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.036 187164 DEBUG nova.virt.hardware [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.036 187164 DEBUG nova.virt.hardware [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.036 187164 DEBUG nova.virt.hardware [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.036 187164 DEBUG nova.virt.hardware [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.037 187164 DEBUG nova.virt.hardware [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.037 187164 DEBUG nova.virt.hardware [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.037 187164 DEBUG nova.virt.hardware [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.040 187164 DEBUG nova.virt.libvirt.vif [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:56:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-456483940',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-456483940',id=16,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-u55jv8k1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:56:20Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=4e3508c1-ecaa-442a-95c5-f5095e12912e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "address": "fa:16:3e:83:2c:4d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74c7c063-9c", "ovs_interfaceid": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.040 187164 DEBUG nova.network.os_vif_util [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "address": "fa:16:3e:83:2c:4d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74c7c063-9c", "ovs_interfaceid": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.041 187164 DEBUG nova.network.os_vif_util [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:2c:4d,bridge_name='br-int',has_traffic_filtering=True,id=74c7c063-9c46-4bde-8e16-cab6f4e2e23c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74c7c063-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.042 187164 DEBUG nova.objects.instance [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e3508c1-ecaa-442a-95c5-f5095e12912e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.063 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  <uuid>4e3508c1-ecaa-442a-95c5-f5095e12912e</uuid>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  <name>instance-00000010</name>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteStrategies-server-456483940</nova:name>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 12:56:26</nova:creationTime>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:        <nova:user uuid="0ae0bb20ac8b4be99eb1abddc7310436">tempest-TestExecuteStrategies-192029678-project-member</nova:user>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:        <nova:project uuid="e6ae0d0dcde04b85b6dae45560cca988">tempest-TestExecuteStrategies-192029678</nova:project>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:        <nova:port uuid="74c7c063-9c46-4bde-8e16-cab6f4e2e23c">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <system>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <entry name="serial">4e3508c1-ecaa-442a-95c5-f5095e12912e</entry>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <entry name="uuid">4e3508c1-ecaa-442a-95c5-f5095e12912e</entry>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    </system>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  <os>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  </clock>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk.config"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:83:2c:4d"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <target dev="tap74c7c063-9c"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/console.log" append="off"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    </serial>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <video>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 07:56:26 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 07:56:26 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:56:26 np0005546954 nova_compute[187160]: </domain>
Dec  5 07:56:26 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.064 187164 DEBUG nova.compute.manager [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Preparing to wait for external event network-vif-plugged-74c7c063-9c46-4bde-8e16-cab6f4e2e23c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.065 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.065 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.065 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.066 187164 DEBUG nova.virt.libvirt.vif [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:56:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-456483940',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-456483940',id=16,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-u55jv8k1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:56:20Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=4e3508c1-ecaa-442a-95c5-f5095e12912e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "address": "fa:16:3e:83:2c:4d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74c7c063-9c", "ovs_interfaceid": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.066 187164 DEBUG nova.network.os_vif_util [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "address": "fa:16:3e:83:2c:4d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74c7c063-9c", "ovs_interfaceid": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.066 187164 DEBUG nova.network.os_vif_util [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:2c:4d,bridge_name='br-int',has_traffic_filtering=True,id=74c7c063-9c46-4bde-8e16-cab6f4e2e23c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74c7c063-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.067 187164 DEBUG os_vif [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:2c:4d,bridge_name='br-int',has_traffic_filtering=True,id=74c7c063-9c46-4bde-8e16-cab6f4e2e23c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74c7c063-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.067 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.067 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.068 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.070 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.070 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74c7c063-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.070 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap74c7c063-9c, col_values=(('external_ids', {'iface-id': '74c7c063-9c46-4bde-8e16-cab6f4e2e23c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:2c:4d', 'vm-uuid': '4e3508c1-ecaa-442a-95c5-f5095e12912e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:56:26 np0005546954 NetworkManager[55665]: <info>  [1764939386.0726] manager: (tap74c7c063-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.074 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.077 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.078 187164 INFO os_vif [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:2c:4d,bridge_name='br-int',has_traffic_filtering=True,id=74c7c063-9c46-4bde-8e16-cab6f4e2e23c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74c7c063-9c')#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.166 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.167 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.167 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No VIF found with MAC fa:16:3e:83:2c:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.167 187164 INFO nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Using config drive#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.825 187164 INFO nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Creating config drive at /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk.config#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.833 187164 DEBUG oslo_concurrency.processutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwz76c6tt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:56:26 np0005546954 nova_compute[187160]: 2025-12-05 12:56:26.968 187164 DEBUG oslo_concurrency.processutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwz76c6tt" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:56:27 np0005546954 kernel: tap74c7c063-9c: entered promiscuous mode
Dec  5 07:56:27 np0005546954 NetworkManager[55665]: <info>  [1764939387.0282] manager: (tap74c7c063-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Dec  5 07:56:27 np0005546954 ovn_controller[95566]: 2025-12-05T12:56:27Z|00152|binding|INFO|Claiming lport 74c7c063-9c46-4bde-8e16-cab6f4e2e23c for this chassis.
Dec  5 07:56:27 np0005546954 ovn_controller[95566]: 2025-12-05T12:56:27Z|00153|binding|INFO|74c7c063-9c46-4bde-8e16-cab6f4e2e23c: Claiming fa:16:3e:83:2c:4d 10.100.0.10
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.030 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:27 np0005546954 ovn_controller[95566]: 2025-12-05T12:56:27Z|00154|binding|INFO|Setting lport 74c7c063-9c46-4bde-8e16-cab6f4e2e23c ovn-installed in OVS
Dec  5 07:56:27 np0005546954 ovn_controller[95566]: 2025-12-05T12:56:27Z|00155|binding|INFO|Setting lport 74c7c063-9c46-4bde-8e16-cab6f4e2e23c up in Southbound
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.150 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:2c:4d 10.100.0.10'], port_security=['fa:16:3e:83:2c:4d 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4e3508c1-ecaa-442a-95c5-f5095e12912e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=74c7c063-9c46-4bde-8e16-cab6f4e2e23c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:56:27 np0005546954 systemd-udevd[213820]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.150 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.152 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 74c7c063-9c46-4bde-8e16-cab6f4e2e23c in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.154 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.167 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[568572e4-5055-4918-87ca-5986867c2880]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.169 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd4389bc8-21 in ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.172 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd4389bc8-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:56:27 np0005546954 NetworkManager[55665]: <info>  [1764939387.1728] device (tap74c7c063-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.172 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[a03c3fd4-9f36-474c-8026-53fd93a9857d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.173 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[538096c1-17b4-438d-a7d1-1f9ef853dfb9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 NetworkManager[55665]: <info>  [1764939387.1745] device (tap74c7c063-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:56:27 np0005546954 systemd-machined[153497]: New machine qemu-14-instance-00000010.
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.188 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[16889e22-9367-4666-86eb-ac1d3f99180b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 systemd[1]: Started Virtual Machine qemu-14-instance-00000010.
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.206 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[a729446e-8bcd-485e-9a00-424c8cd7a2d6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.242 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[0f7c81d4-a229-431e-afb8-82ec8b9fb122]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.247 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[a05c469f-a14c-47a6-b235-24d66cd12f43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 NetworkManager[55665]: <info>  [1764939387.2487] manager: (tapd4389bc8-20): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Dec  5 07:56:27 np0005546954 systemd-udevd[213825]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.288 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[ef5c98f2-718c-4b9f-9e7f-a72dcad33622]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.292 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[1639b98f-3f80-4ecf-9b90-b9ec47ff54ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 NetworkManager[55665]: <info>  [1764939387.3328] device (tapd4389bc8-20): carrier: link connected
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.343 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[f7b2e1b6-ebe3-4e35-a8fd-98b389612453]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.366 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e6f3d186-d685-45c3-b955-a744827916f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439395, 'reachable_time': 38389, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213856, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.381 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[746df8d9-7ce3-41f5-a3b3-ec98c7bd47f7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:43f7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439395, 'tstamp': 439395}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213858, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.398 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b3700b88-e6d1-41f8-9ce8-34a139a7896f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439395, 'reachable_time': 38389, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213859, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.428 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[bd3556ae-738e-4363-91c7-e1922640dd42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.490 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[23f044f3-cedf-411c-8611-66095a51925f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.491 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.491 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.492 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:56:27 np0005546954 NetworkManager[55665]: <info>  [1764939387.4945] manager: (tapd4389bc8-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Dec  5 07:56:27 np0005546954 kernel: tapd4389bc8-20: entered promiscuous mode
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.493 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.497 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.496 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:27 np0005546954 ovn_controller[95566]: 2025-12-05T12:56:27Z|00156|binding|INFO|Releasing lport 8dbe2af5-9f18-44ca-8f22-66854bcdd596 from this chassis (sb_readonly=0)
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.498 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.499 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.500 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[801f6e7d-598f-48ce-abc6-66dba7c0d42a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.501 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:56:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:56:27.501 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'env', 'PROCESS_TAG=haproxy-d4389bc8-2898-48b0-9741-5183b54fe83c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d4389bc8-2898-48b0-9741-5183b54fe83c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.510 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.739 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939387.7392187, 4e3508c1-ecaa-442a-95c5-f5095e12912e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.740 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] VM Started (Lifecycle Event)#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.767 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.771 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939387.7393417, 4e3508c1-ecaa-442a-95c5-f5095e12912e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.772 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.808 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.815 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.842 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.903 187164 DEBUG nova.compute.manager [req-739e7cb4-e9e7-4453-b0f2-ee7610c5656a req-67f410fd-712e-461e-bb06-a42f85fa2b58 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Received event network-vif-plugged-74c7c063-9c46-4bde-8e16-cab6f4e2e23c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.903 187164 DEBUG oslo_concurrency.lockutils [req-739e7cb4-e9e7-4453-b0f2-ee7610c5656a req-67f410fd-712e-461e-bb06-a42f85fa2b58 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.903 187164 DEBUG oslo_concurrency.lockutils [req-739e7cb4-e9e7-4453-b0f2-ee7610c5656a req-67f410fd-712e-461e-bb06-a42f85fa2b58 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.904 187164 DEBUG oslo_concurrency.lockutils [req-739e7cb4-e9e7-4453-b0f2-ee7610c5656a req-67f410fd-712e-461e-bb06-a42f85fa2b58 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.904 187164 DEBUG nova.compute.manager [req-739e7cb4-e9e7-4453-b0f2-ee7610c5656a req-67f410fd-712e-461e-bb06-a42f85fa2b58 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Processing event network-vif-plugged-74c7c063-9c46-4bde-8e16-cab6f4e2e23c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.904 187164 DEBUG nova.compute.manager [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.908 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939387.9084442, 4e3508c1-ecaa-442a-95c5-f5095e12912e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.909 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.911 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.915 187164 INFO nova.virt.libvirt.driver [-] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Instance spawned successfully.#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.916 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.937 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.945 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.948 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.948 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.949 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.949 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.950 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.950 187164 DEBUG nova.virt.libvirt.driver [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:56:27 np0005546954 nova_compute[187160]: 2025-12-05 12:56:27.989 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:56:28 np0005546954 podman[213899]: 2025-12-05 12:56:27.935085463 +0000 UTC m=+0.038529828 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:56:28 np0005546954 nova_compute[187160]: 2025-12-05 12:56:28.032 187164 INFO nova.compute.manager [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Took 7.45 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:56:28 np0005546954 nova_compute[187160]: 2025-12-05 12:56:28.032 187164 DEBUG nova.compute.manager [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:56:28 np0005546954 nova_compute[187160]: 2025-12-05 12:56:28.093 187164 INFO nova.compute.manager [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Took 7.97 seconds to build instance.#033[00m
Dec  5 07:56:28 np0005546954 nova_compute[187160]: 2025-12-05 12:56:28.111 187164 DEBUG oslo_concurrency.lockutils [None req-828a2de7-2e95-44a4-918f-f42f85561a3a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:56:28 np0005546954 podman[213899]: 2025-12-05 12:56:28.249284698 +0000 UTC m=+0.352729023 container create 0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  5 07:56:28 np0005546954 systemd[1]: Started libpod-conmon-0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2.scope.
Dec  5 07:56:28 np0005546954 podman[213912]: 2025-12-05 12:56:28.412935116 +0000 UTC m=+0.119742913 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 07:56:28 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:56:28 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e7f9e6951f9b999b19879099be21895d424e854cf409f21b137312b3c5698dc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:56:28 np0005546954 podman[213899]: 2025-12-05 12:56:28.490398583 +0000 UTC m=+0.593842958 container init 0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec  5 07:56:28 np0005546954 podman[213899]: 2025-12-05 12:56:28.501508981 +0000 UTC m=+0.604953306 container start 0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:56:28 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213932]: [NOTICE]   (213938) : New worker (213940) forked
Dec  5 07:56:28 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213932]: [NOTICE]   (213938) : Loading success.
Dec  5 07:56:28 np0005546954 nova_compute[187160]: 2025-12-05 12:56:28.649 187164 DEBUG nova.network.neutron [req-2efdfa9e-9045-4e64-8409-5c63553c6477 req-218daac4-12db-4525-9116-017ad809327d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Updated VIF entry in instance network info cache for port 74c7c063-9c46-4bde-8e16-cab6f4e2e23c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:56:28 np0005546954 nova_compute[187160]: 2025-12-05 12:56:28.651 187164 DEBUG nova.network.neutron [req-2efdfa9e-9045-4e64-8409-5c63553c6477 req-218daac4-12db-4525-9116-017ad809327d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Updating instance_info_cache with network_info: [{"id": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "address": "fa:16:3e:83:2c:4d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74c7c063-9c", "ovs_interfaceid": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:56:28 np0005546954 nova_compute[187160]: 2025-12-05 12:56:28.673 187164 DEBUG oslo_concurrency.lockutils [req-2efdfa9e-9045-4e64-8409-5c63553c6477 req-218daac4-12db-4525-9116-017ad809327d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-4e3508c1-ecaa-442a-95c5-f5095e12912e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:56:29 np0005546954 nova_compute[187160]: 2025-12-05 12:56:29.991 187164 DEBUG nova.compute.manager [req-10e226cb-42fa-4cbf-b864-87213ac6052b req-b3cc658d-1dd8-4686-8855-43e956c84e84 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Received event network-vif-plugged-74c7c063-9c46-4bde-8e16-cab6f4e2e23c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:56:29 np0005546954 nova_compute[187160]: 2025-12-05 12:56:29.991 187164 DEBUG oslo_concurrency.lockutils [req-10e226cb-42fa-4cbf-b864-87213ac6052b req-b3cc658d-1dd8-4686-8855-43e956c84e84 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:56:29 np0005546954 nova_compute[187160]: 2025-12-05 12:56:29.992 187164 DEBUG oslo_concurrency.lockutils [req-10e226cb-42fa-4cbf-b864-87213ac6052b req-b3cc658d-1dd8-4686-8855-43e956c84e84 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:56:29 np0005546954 nova_compute[187160]: 2025-12-05 12:56:29.992 187164 DEBUG oslo_concurrency.lockutils [req-10e226cb-42fa-4cbf-b864-87213ac6052b req-b3cc658d-1dd8-4686-8855-43e956c84e84 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:56:29 np0005546954 nova_compute[187160]: 2025-12-05 12:56:29.992 187164 DEBUG nova.compute.manager [req-10e226cb-42fa-4cbf-b864-87213ac6052b req-b3cc658d-1dd8-4686-8855-43e956c84e84 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] No waiting events found dispatching network-vif-plugged-74c7c063-9c46-4bde-8e16-cab6f4e2e23c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:56:29 np0005546954 nova_compute[187160]: 2025-12-05 12:56:29.992 187164 WARNING nova.compute.manager [req-10e226cb-42fa-4cbf-b864-87213ac6052b req-b3cc658d-1dd8-4686-8855-43e956c84e84 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Received unexpected event network-vif-plugged-74c7c063-9c46-4bde-8e16-cab6f4e2e23c for instance with vm_state active and task_state None.#033[00m
Dec  5 07:56:30 np0005546954 nova_compute[187160]: 2025-12-05 12:56:30.605 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:31 np0005546954 nova_compute[187160]: 2025-12-05 12:56:31.072 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:33 np0005546954 podman[213952]: 2025-12-05 12:56:33.54273402 +0000 UTC m=+0.053820668 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:56:33 np0005546954 podman[213951]: 2025-12-05 12:56:33.570851881 +0000 UTC m=+0.084971954 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:56:35 np0005546954 nova_compute[187160]: 2025-12-05 12:56:35.608 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:35 np0005546954 podman[197513]: time="2025-12-05T12:56:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:56:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:56:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 07:56:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:56:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Dec  5 07:56:36 np0005546954 nova_compute[187160]: 2025-12-05 12:56:36.074 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:40 np0005546954 nova_compute[187160]: 2025-12-05 12:56:40.610 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:40 np0005546954 ovn_controller[95566]: 2025-12-05T12:56:40Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:2c:4d 10.100.0.10
Dec  5 07:56:40 np0005546954 ovn_controller[95566]: 2025-12-05T12:56:40Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:2c:4d 10.100.0.10
Dec  5 07:56:41 np0005546954 nova_compute[187160]: 2025-12-05 12:56:41.075 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:45 np0005546954 nova_compute[187160]: 2025-12-05 12:56:45.613 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:46 np0005546954 nova_compute[187160]: 2025-12-05 12:56:46.077 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:46 np0005546954 podman[214020]: 2025-12-05 12:56:46.608512959 +0000 UTC m=+0.096545717 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  5 07:56:46 np0005546954 podman[214021]: 2025-12-05 12:56:46.619531244 +0000 UTC m=+0.103334169 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 07:56:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:56:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:56:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:56:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:56:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:56:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:56:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:56:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:56:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:56:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:56:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:56:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:56:50 np0005546954 nova_compute[187160]: 2025-12-05 12:56:50.617 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:51 np0005546954 nova_compute[187160]: 2025-12-05 12:56:51.079 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:55 np0005546954 nova_compute[187160]: 2025-12-05 12:56:55.619 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:56 np0005546954 nova_compute[187160]: 2025-12-05 12:56:56.082 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:56:57 np0005546954 ovn_controller[95566]: 2025-12-05T12:56:57Z|00157|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Dec  5 07:56:58 np0005546954 podman[214061]: 2025-12-05 12:56:58.567473626 +0000 UTC m=+0.078261863 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:57:00 np0005546954 nova_compute[187160]: 2025-12-05 12:57:00.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:57:00 np0005546954 nova_compute[187160]: 2025-12-05 12:57:00.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:57:00 np0005546954 nova_compute[187160]: 2025-12-05 12:57:00.041 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:57:00 np0005546954 nova_compute[187160]: 2025-12-05 12:57:00.624 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:00 np0005546954 nova_compute[187160]: 2025-12-05 12:57:00.659 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-4e3508c1-ecaa-442a-95c5-f5095e12912e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:57:00 np0005546954 nova_compute[187160]: 2025-12-05 12:57:00.659 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-4e3508c1-ecaa-442a-95c5-f5095e12912e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:57:00 np0005546954 nova_compute[187160]: 2025-12-05 12:57:00.660 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:57:00 np0005546954 nova_compute[187160]: 2025-12-05 12:57:00.660 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4e3508c1-ecaa-442a-95c5-f5095e12912e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:57:01 np0005546954 nova_compute[187160]: 2025-12-05 12:57:01.084 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:04 np0005546954 podman[214081]: 2025-12-05 12:57:04.58607901 +0000 UTC m=+0.079332267 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:57:04 np0005546954 podman[214080]: 2025-12-05 12:57:04.627651523 +0000 UTC m=+0.120825078 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:57:04 np0005546954 nova_compute[187160]: 2025-12-05 12:57:04.750 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Updating instance_info_cache with network_info: [{"id": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "address": "fa:16:3e:83:2c:4d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74c7c063-9c", "ovs_interfaceid": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:57:04 np0005546954 nova_compute[187160]: 2025-12-05 12:57:04.814 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-4e3508c1-ecaa-442a-95c5-f5095e12912e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:57:04 np0005546954 nova_compute[187160]: 2025-12-05 12:57:04.815 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:57:04 np0005546954 nova_compute[187160]: 2025-12-05 12:57:04.815 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:57:04 np0005546954 nova_compute[187160]: 2025-12-05 12:57:04.815 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:57:05 np0005546954 nova_compute[187160]: 2025-12-05 12:57:05.627 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:05 np0005546954 podman[197513]: time="2025-12-05T12:57:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:57:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:57:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 07:57:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:57:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3055 "" "Go-http-client/1.1"
Dec  5 07:57:06 np0005546954 nova_compute[187160]: 2025-12-05 12:57:06.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:57:06 np0005546954 nova_compute[187160]: 2025-12-05 12:57:06.086 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:07 np0005546954 nova_compute[187160]: 2025-12-05 12:57:07.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.074 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.074 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.075 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.075 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.180 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.248 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.249 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.314 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.540 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.542 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5679MB free_disk=73.30705642700195GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.542 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.542 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.620 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 4e3508c1-ecaa-442a-95c5-f5095e12912e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.621 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.621 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.686 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.707 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.738 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:57:08 np0005546954 nova_compute[187160]: 2025-12-05 12:57:08.738 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:10 np0005546954 nova_compute[187160]: 2025-12-05 12:57:10.630 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:10 np0005546954 nova_compute[187160]: 2025-12-05 12:57:10.738 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:57:11 np0005546954 nova_compute[187160]: 2025-12-05 12:57:11.088 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:11 np0005546954 nova_compute[187160]: 2025-12-05 12:57:11.920 187164 DEBUG nova.virt.libvirt.driver [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Creating tmpfile /var/lib/nova/instances/tmpi4w_1yvp to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  5 07:57:11 np0005546954 nova_compute[187160]: 2025-12-05 12:57:11.922 187164 DEBUG nova.compute.manager [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi4w_1yvp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  5 07:57:12 np0005546954 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  5 07:57:13 np0005546954 nova_compute[187160]: 2025-12-05 12:57:13.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:57:13 np0005546954 nova_compute[187160]: 2025-12-05 12:57:13.041 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:57:13 np0005546954 nova_compute[187160]: 2025-12-05 12:57:13.384 187164 DEBUG nova.compute.manager [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi4w_1yvp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='199cbeb6-984a-4e32-8ae3-766207e89849',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  5 07:57:13 np0005546954 nova_compute[187160]: 2025-12-05 12:57:13.422 187164 DEBUG oslo_concurrency.lockutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-199cbeb6-984a-4e32-8ae3-766207e89849" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:57:13 np0005546954 nova_compute[187160]: 2025-12-05 12:57:13.423 187164 DEBUG oslo_concurrency.lockutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-199cbeb6-984a-4e32-8ae3-766207e89849" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:57:13 np0005546954 nova_compute[187160]: 2025-12-05 12:57:13.423 187164 DEBUG nova.network.neutron [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:57:14 np0005546954 nova_compute[187160]: 2025-12-05 12:57:14.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:57:14 np0005546954 nova_compute[187160]: 2025-12-05 12:57:14.066 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.634 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.689 187164 DEBUG nova.network.neutron [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Updating instance_info_cache with network_info: [{"id": "2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6", "address": "fa:16:3e:f8:bd:3b", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b67cffe-9f", "ovs_interfaceid": "2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.733 187164 DEBUG oslo_concurrency.lockutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-199cbeb6-984a-4e32-8ae3-766207e89849" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.736 187164 DEBUG nova.virt.libvirt.driver [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi4w_1yvp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='199cbeb6-984a-4e32-8ae3-766207e89849',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.737 187164 DEBUG nova.virt.libvirt.driver [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Creating instance directory: /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.738 187164 DEBUG nova.virt.libvirt.driver [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Creating disk.info with the contents: {'/var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk': 'qcow2', '/var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.738 187164 DEBUG nova.virt.libvirt.driver [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.739 187164 DEBUG nova.objects.instance [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 199cbeb6-984a-4e32-8ae3-766207e89849 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.778 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.877 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.878 187164 DEBUG oslo_concurrency.lockutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.879 187164 DEBUG oslo_concurrency.lockutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.893 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.966 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:57:15 np0005546954 nova_compute[187160]: 2025-12-05 12:57:15.967 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.003 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.005 187164 DEBUG oslo_concurrency.lockutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.005 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.057 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.058 187164 DEBUG nova.virt.disk.api [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Checking if we can resize image /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.058 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.090 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.111 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.112 187164 DEBUG nova.virt.disk.api [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Cannot resize image /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.113 187164 DEBUG nova.objects.instance [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'migration_context' on Instance uuid 199cbeb6-984a-4e32-8ae3-766207e89849 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.127 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.170 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk.config 485376" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.173 187164 DEBUG nova.virt.libvirt.volume.remotefs [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk.config to /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.173 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk.config /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.615 187164 DEBUG oslo_concurrency.processutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849/disk.config /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.617 187164 DEBUG nova.virt.libvirt.driver [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.619 187164 DEBUG nova.virt.libvirt.vif [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:56:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1436754950',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1436754950',id=15,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:56:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-ds6vwl56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:56:12Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=199cbeb6-984a-4e32-8ae3-766207e89849,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6", "address": "fa:16:3e:f8:bd:3b", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2b67cffe-9f", "ovs_interfaceid": "2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.620 187164 DEBUG nova.network.os_vif_util [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6", "address": "fa:16:3e:f8:bd:3b", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2b67cffe-9f", "ovs_interfaceid": "2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.622 187164 DEBUG nova.network.os_vif_util [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f8:bd:3b,bridge_name='br-int',has_traffic_filtering=True,id=2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b67cffe-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.623 187164 DEBUG os_vif [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:bd:3b,bridge_name='br-int',has_traffic_filtering=True,id=2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b67cffe-9f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.625 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.626 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.627 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.633 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.633 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2b67cffe-9f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.635 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2b67cffe-9f, col_values=(('external_ids', {'iface-id': '2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:bd:3b', 'vm-uuid': '199cbeb6-984a-4e32-8ae3-766207e89849'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.637 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:16 np0005546954 NetworkManager[55665]: <info>  [1764939436.6407] manager: (tap2b67cffe-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.641 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.645 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.645 187164 INFO os_vif [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:bd:3b,bridge_name='br-int',has_traffic_filtering=True,id=2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b67cffe-9f')#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.646 187164 DEBUG nova.virt.libvirt.driver [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  5 07:57:16 np0005546954 nova_compute[187160]: 2025-12-05 12:57:16.646 187164 DEBUG nova.compute.manager [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi4w_1yvp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='199cbeb6-984a-4e32-8ae3-766207e89849',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  5 07:57:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:16.957 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:16.959 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:16.960 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:17 np0005546954 podman[214158]: 2025-12-05 12:57:17.549622436 +0000 UTC m=+0.061858899 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter)
Dec  5 07:57:17 np0005546954 podman[214159]: 2025-12-05 12:57:17.55998172 +0000 UTC m=+0.067798905 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:57:18 np0005546954 nova_compute[187160]: 2025-12-05 12:57:18.678 187164 DEBUG nova.network.neutron [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Port 2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  5 07:57:18 np0005546954 nova_compute[187160]: 2025-12-05 12:57:18.680 187164 DEBUG nova.compute.manager [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi4w_1yvp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='199cbeb6-984a-4e32-8ae3-766207e89849',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  5 07:57:18 np0005546954 systemd[1]: Starting libvirt proxy daemon...
Dec  5 07:57:18 np0005546954 systemd[1]: Started libvirt proxy daemon.
Dec  5 07:57:19 np0005546954 kernel: tap2b67cffe-9f: entered promiscuous mode
Dec  5 07:57:19 np0005546954 NetworkManager[55665]: <info>  [1764939439.0813] manager: (tap2b67cffe-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Dec  5 07:57:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:57:19Z|00158|binding|INFO|Claiming lport 2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 for this additional chassis.
Dec  5 07:57:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:57:19Z|00159|binding|INFO|2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6: Claiming fa:16:3e:f8:bd:3b 10.100.0.8
Dec  5 07:57:19 np0005546954 nova_compute[187160]: 2025-12-05 12:57:19.082 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:57:19Z|00160|binding|INFO|Setting lport 2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 ovn-installed in OVS
Dec  5 07:57:19 np0005546954 nova_compute[187160]: 2025-12-05 12:57:19.104 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:19 np0005546954 nova_compute[187160]: 2025-12-05 12:57:19.108 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:19 np0005546954 systemd-machined[153497]: New machine qemu-15-instance-0000000f.
Dec  5 07:57:19 np0005546954 systemd-udevd[214229]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:57:19 np0005546954 systemd[1]: Started Virtual Machine qemu-15-instance-0000000f.
Dec  5 07:57:19 np0005546954 NetworkManager[55665]: <info>  [1764939439.1548] device (tap2b67cffe-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:57:19 np0005546954 NetworkManager[55665]: <info>  [1764939439.1563] device (tap2b67cffe-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:57:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:57:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:57:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:57:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:57:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:57:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:57:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:57:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:57:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:57:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:57:20 np0005546954 nova_compute[187160]: 2025-12-05 12:57:20.176 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939440.1756718, 199cbeb6-984a-4e32-8ae3-766207e89849 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:57:20 np0005546954 nova_compute[187160]: 2025-12-05 12:57:20.177 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] VM Started (Lifecycle Event)#033[00m
Dec  5 07:57:20 np0005546954 nova_compute[187160]: 2025-12-05 12:57:20.198 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:57:20 np0005546954 nova_compute[187160]: 2025-12-05 12:57:20.635 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:20 np0005546954 nova_compute[187160]: 2025-12-05 12:57:20.983 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939440.9832945, 199cbeb6-984a-4e32-8ae3-766207e89849 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:57:20 np0005546954 nova_compute[187160]: 2025-12-05 12:57:20.984 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:57:21 np0005546954 nova_compute[187160]: 2025-12-05 12:57:21.013 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:57:21 np0005546954 nova_compute[187160]: 2025-12-05 12:57:21.017 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:57:21 np0005546954 nova_compute[187160]: 2025-12-05 12:57:21.035 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  5 07:57:21 np0005546954 nova_compute[187160]: 2025-12-05 12:57:21.637 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:22 np0005546954 ovn_controller[95566]: 2025-12-05T12:57:22Z|00161|binding|INFO|Claiming lport 2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 for this chassis.
Dec  5 07:57:22 np0005546954 ovn_controller[95566]: 2025-12-05T12:57:22Z|00162|binding|INFO|2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6: Claiming fa:16:3e:f8:bd:3b 10.100.0.8
Dec  5 07:57:22 np0005546954 ovn_controller[95566]: 2025-12-05T12:57:22Z|00163|binding|INFO|Setting lport 2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 up in Southbound
Dec  5 07:57:22 np0005546954 nova_compute[187160]: 2025-12-05 12:57:22.873 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:22.872 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:57:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:22.876 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:57:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:22.881 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:bd:3b 10.100.0.8'], port_security=['fa:16:3e:f8:bd:3b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '199cbeb6-984a-4e32-8ae3-766207e89849', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '10', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:57:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:22.884 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis#033[00m
Dec  5 07:57:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:22.888 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:57:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:22.910 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[649254c8-4609-4c39-aaf9-dd83fb300cef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:22.950 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[660a0014-82a6-4182-88e6-d59130cf1d26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:22.953 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[93328b1f-c0df-455a-845f-869e004b806f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:22.985 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[35d95896-1937-4ffe-8c88-8ffbe8c328de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:23.009 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[bf902706-19f5-4ce5-81d0-3e474cfb1059]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439395, 'reachable_time': 38389, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214262, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:23.035 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[6c121b85-fc92-48d5-a673-72d2f6539756]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439407, 'tstamp': 439407}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214263, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439411, 'tstamp': 439411}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214263, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:23.039 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:23 np0005546954 nova_compute[187160]: 2025-12-05 12:57:23.042 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:23.043 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:23.043 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:57:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:23.044 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:23.045 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:57:23 np0005546954 nova_compute[187160]: 2025-12-05 12:57:23.058 187164 INFO nova.compute.manager [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Post operation of migration started#033[00m
Dec  5 07:57:23 np0005546954 nova_compute[187160]: 2025-12-05 12:57:23.903 187164 DEBUG oslo_concurrency.lockutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-199cbeb6-984a-4e32-8ae3-766207e89849" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:57:23 np0005546954 nova_compute[187160]: 2025-12-05 12:57:23.903 187164 DEBUG oslo_concurrency.lockutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-199cbeb6-984a-4e32-8ae3-766207e89849" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:57:23 np0005546954 nova_compute[187160]: 2025-12-05 12:57:23.904 187164 DEBUG nova.network.neutron [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:57:25 np0005546954 nova_compute[187160]: 2025-12-05 12:57:25.637 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:25 np0005546954 nova_compute[187160]: 2025-12-05 12:57:25.711 187164 DEBUG nova.network.neutron [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Updating instance_info_cache with network_info: [{"id": "2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6", "address": "fa:16:3e:f8:bd:3b", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b67cffe-9f", "ovs_interfaceid": "2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:57:25 np0005546954 nova_compute[187160]: 2025-12-05 12:57:25.760 187164 DEBUG oslo_concurrency.lockutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-199cbeb6-984a-4e32-8ae3-766207e89849" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:57:25 np0005546954 nova_compute[187160]: 2025-12-05 12:57:25.866 187164 DEBUG oslo_concurrency.lockutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:25 np0005546954 nova_compute[187160]: 2025-12-05 12:57:25.867 187164 DEBUG oslo_concurrency.lockutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:25 np0005546954 nova_compute[187160]: 2025-12-05 12:57:25.868 187164 DEBUG oslo_concurrency.lockutils [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:25 np0005546954 nova_compute[187160]: 2025-12-05 12:57:25.873 187164 INFO nova.virt.libvirt.driver [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  5 07:57:25 np0005546954 virtqemud[186730]: Domain id=15 name='instance-0000000f' uuid=199cbeb6-984a-4e32-8ae3-766207e89849 is tainted: custom-monitor
Dec  5 07:57:26 np0005546954 nova_compute[187160]: 2025-12-05 12:57:26.639 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:26 np0005546954 nova_compute[187160]: 2025-12-05 12:57:26.880 187164 INFO nova.virt.libvirt.driver [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  5 07:57:27 np0005546954 nova_compute[187160]: 2025-12-05 12:57:27.886 187164 INFO nova.virt.libvirt.driver [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  5 07:57:27 np0005546954 nova_compute[187160]: 2025-12-05 12:57:27.892 187164 DEBUG nova.compute.manager [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:57:28 np0005546954 nova_compute[187160]: 2025-12-05 12:57:28.269 187164 DEBUG nova.objects.instance [None req-98b1dddd-86aa-487c-91b4-9a681ddbf8f1 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:57:29 np0005546954 podman[214268]: 2025-12-05 12:57:29.587468335 +0000 UTC m=+0.085627175 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 07:57:30 np0005546954 nova_compute[187160]: 2025-12-05 12:57:30.639 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:31 np0005546954 nova_compute[187160]: 2025-12-05 12:57:31.730 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:31.880 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.122 187164 DEBUG oslo_concurrency.lockutils [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "4e3508c1-ecaa-442a-95c5-f5095e12912e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.122 187164 DEBUG oslo_concurrency.lockutils [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.123 187164 DEBUG oslo_concurrency.lockutils [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.123 187164 DEBUG oslo_concurrency.lockutils [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.123 187164 DEBUG oslo_concurrency.lockutils [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.124 187164 INFO nova.compute.manager [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Terminating instance#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.126 187164 DEBUG nova.compute.manager [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:57:33 np0005546954 kernel: tap74c7c063-9c (unregistering): left promiscuous mode
Dec  5 07:57:33 np0005546954 NetworkManager[55665]: <info>  [1764939453.1577] device (tap74c7c063-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:57:33 np0005546954 ovn_controller[95566]: 2025-12-05T12:57:33Z|00164|binding|INFO|Releasing lport 74c7c063-9c46-4bde-8e16-cab6f4e2e23c from this chassis (sb_readonly=0)
Dec  5 07:57:33 np0005546954 ovn_controller[95566]: 2025-12-05T12:57:33Z|00165|binding|INFO|Setting lport 74c7c063-9c46-4bde-8e16-cab6f4e2e23c down in Southbound
Dec  5 07:57:33 np0005546954 ovn_controller[95566]: 2025-12-05T12:57:33Z|00166|binding|INFO|Removing iface tap74c7c063-9c ovn-installed in OVS
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.172 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.176 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.182 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:2c:4d 10.100.0.10'], port_security=['fa:16:3e:83:2c:4d 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4e3508c1-ecaa-442a-95c5-f5095e12912e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=74c7c063-9c46-4bde-8e16-cab6f4e2e23c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.185 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 74c7c063-9c46-4bde-8e16-cab6f4e2e23c in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.187 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.191 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.210 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[814623f2-77f1-4ed8-aa98-cfa0e740b5da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:33 np0005546954 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000010.scope: Deactivated successfully.
Dec  5 07:57:33 np0005546954 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000010.scope: Consumed 15.359s CPU time.
Dec  5 07:57:33 np0005546954 systemd-machined[153497]: Machine qemu-14-instance-00000010 terminated.
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.249 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[17261dc7-dc94-440e-88e6-a63c84b0cb6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.253 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[af832655-6bb7-46ed-ae0c-2aaa33fb9ee1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.295 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[14e98cc6-79ec-432f-b8b1-35d06aab7cef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.320 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4d37fa78-25b9-4abb-b300-3117511ca279]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439395, 'reachable_time': 38389, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214297, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.333 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e80c7a4e-135e-4115-80ac-1fea8795d6d0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439407, 'tstamp': 439407}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214298, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439411, 'tstamp': 439411}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214298, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.334 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.335 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.341 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.341 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.341 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.342 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:33 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:33.342 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.391 187164 INFO nova.virt.libvirt.driver [-] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Instance destroyed successfully.#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.391 187164 DEBUG nova.objects.instance [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'resources' on Instance uuid 4e3508c1-ecaa-442a-95c5-f5095e12912e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.411 187164 DEBUG nova.virt.libvirt.vif [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:56:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-456483940',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-456483940',id=16,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:56:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-u55jv8k1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_na
me='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:56:28Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=4e3508c1-ecaa-442a-95c5-f5095e12912e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "address": "fa:16:3e:83:2c:4d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74c7c063-9c", "ovs_interfaceid": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.411 187164 DEBUG nova.network.os_vif_util [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "address": "fa:16:3e:83:2c:4d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74c7c063-9c", "ovs_interfaceid": "74c7c063-9c46-4bde-8e16-cab6f4e2e23c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.412 187164 DEBUG nova.network.os_vif_util [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:2c:4d,bridge_name='br-int',has_traffic_filtering=True,id=74c7c063-9c46-4bde-8e16-cab6f4e2e23c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74c7c063-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.412 187164 DEBUG os_vif [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:2c:4d,bridge_name='br-int',has_traffic_filtering=True,id=74c7c063-9c46-4bde-8e16-cab6f4e2e23c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74c7c063-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.414 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.414 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74c7c063-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.416 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.418 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.418 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.421 187164 INFO os_vif [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:2c:4d,bridge_name='br-int',has_traffic_filtering=True,id=74c7c063-9c46-4bde-8e16-cab6f4e2e23c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74c7c063-9c')#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.421 187164 INFO nova.virt.libvirt.driver [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Deleting instance files /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e_del#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.422 187164 INFO nova.virt.libvirt.driver [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Deletion of /var/lib/nova/instances/4e3508c1-ecaa-442a-95c5-f5095e12912e_del complete#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.490 187164 INFO nova.compute.manager [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.490 187164 DEBUG oslo.service.loopingcall [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.491 187164 DEBUG nova.compute.manager [-] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.491 187164 DEBUG nova.network.neutron [-] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.885 187164 DEBUG nova.compute.manager [req-d9b2aafb-3aeb-47a9-bec6-9b75bc4c6b8a req-6ccb7d0c-b1cf-4f91-a43b-b5c14849d26a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Received event network-vif-unplugged-74c7c063-9c46-4bde-8e16-cab6f4e2e23c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.886 187164 DEBUG oslo_concurrency.lockutils [req-d9b2aafb-3aeb-47a9-bec6-9b75bc4c6b8a req-6ccb7d0c-b1cf-4f91-a43b-b5c14849d26a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.886 187164 DEBUG oslo_concurrency.lockutils [req-d9b2aafb-3aeb-47a9-bec6-9b75bc4c6b8a req-6ccb7d0c-b1cf-4f91-a43b-b5c14849d26a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.887 187164 DEBUG oslo_concurrency.lockutils [req-d9b2aafb-3aeb-47a9-bec6-9b75bc4c6b8a req-6ccb7d0c-b1cf-4f91-a43b-b5c14849d26a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.887 187164 DEBUG nova.compute.manager [req-d9b2aafb-3aeb-47a9-bec6-9b75bc4c6b8a req-6ccb7d0c-b1cf-4f91-a43b-b5c14849d26a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] No waiting events found dispatching network-vif-unplugged-74c7c063-9c46-4bde-8e16-cab6f4e2e23c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:57:33 np0005546954 nova_compute[187160]: 2025-12-05 12:57:33.887 187164 DEBUG nova.compute.manager [req-d9b2aafb-3aeb-47a9-bec6-9b75bc4c6b8a req-6ccb7d0c-b1cf-4f91-a43b-b5c14849d26a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Received event network-vif-unplugged-74c7c063-9c46-4bde-8e16-cab6f4e2e23c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:57:34 np0005546954 nova_compute[187160]: 2025-12-05 12:57:34.182 187164 DEBUG nova.network.neutron [-] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:57:34 np0005546954 nova_compute[187160]: 2025-12-05 12:57:34.201 187164 INFO nova.compute.manager [-] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Took 0.71 seconds to deallocate network for instance.#033[00m
Dec  5 07:57:34 np0005546954 nova_compute[187160]: 2025-12-05 12:57:34.256 187164 DEBUG oslo_concurrency.lockutils [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:34 np0005546954 nova_compute[187160]: 2025-12-05 12:57:34.257 187164 DEBUG oslo_concurrency.lockutils [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:34 np0005546954 nova_compute[187160]: 2025-12-05 12:57:34.354 187164 DEBUG nova.compute.provider_tree [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:57:34 np0005546954 nova_compute[187160]: 2025-12-05 12:57:34.379 187164 DEBUG nova.scheduler.client.report [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:57:34 np0005546954 nova_compute[187160]: 2025-12-05 12:57:34.408 187164 DEBUG oslo_concurrency.lockutils [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:34 np0005546954 nova_compute[187160]: 2025-12-05 12:57:34.438 187164 INFO nova.scheduler.client.report [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Deleted allocations for instance 4e3508c1-ecaa-442a-95c5-f5095e12912e#033[00m
Dec  5 07:57:34 np0005546954 nova_compute[187160]: 2025-12-05 12:57:34.546 187164 DEBUG oslo_concurrency.lockutils [None req-39b7da9e-42a3-4598-bcfc-54f308569dd6 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.135 187164 DEBUG oslo_concurrency.lockutils [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "199cbeb6-984a-4e32-8ae3-766207e89849" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.135 187164 DEBUG oslo_concurrency.lockutils [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "199cbeb6-984a-4e32-8ae3-766207e89849" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.136 187164 DEBUG oslo_concurrency.lockutils [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "199cbeb6-984a-4e32-8ae3-766207e89849-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.136 187164 DEBUG oslo_concurrency.lockutils [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "199cbeb6-984a-4e32-8ae3-766207e89849-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.136 187164 DEBUG oslo_concurrency.lockutils [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "199cbeb6-984a-4e32-8ae3-766207e89849-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.138 187164 INFO nova.compute.manager [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Terminating instance#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.139 187164 DEBUG nova.compute.manager [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:57:35 np0005546954 kernel: tap2b67cffe-9f (unregistering): left promiscuous mode
Dec  5 07:57:35 np0005546954 NetworkManager[55665]: <info>  [1764939455.1683] device (tap2b67cffe-9f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:57:35 np0005546954 ovn_controller[95566]: 2025-12-05T12:57:35Z|00167|binding|INFO|Releasing lport 2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 from this chassis (sb_readonly=0)
Dec  5 07:57:35 np0005546954 ovn_controller[95566]: 2025-12-05T12:57:35Z|00168|binding|INFO|Setting lport 2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 down in Southbound
Dec  5 07:57:35 np0005546954 ovn_controller[95566]: 2025-12-05T12:57:35Z|00169|binding|INFO|Removing iface tap2b67cffe-9f ovn-installed in OVS
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.170 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.180 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:bd:3b 10.100.0.8'], port_security=['fa:16:3e:f8:bd:3b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '199cbeb6-984a-4e32-8ae3-766207e89849', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '12', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.181 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.184 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d4389bc8-2898-48b0-9741-5183b54fe83c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.185 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[09fef2a2-f745-4533-88b0-237f8e2d8762]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.186 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace which is not needed anymore#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.204 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:35 np0005546954 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Dec  5 07:57:35 np0005546954 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000000f.scope: Consumed 2.221s CPU time.
Dec  5 07:57:35 np0005546954 systemd-machined[153497]: Machine qemu-15-instance-0000000f terminated.
Dec  5 07:57:35 np0005546954 podman[214328]: 2025-12-05 12:57:35.301181645 +0000 UTC m=+0.069892950 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:57:35 np0005546954 podman[214324]: 2025-12-05 12:57:35.3302209 +0000 UTC m=+0.099189623 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:57:35 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213932]: [NOTICE]   (213938) : haproxy version is 2.8.14-c23fe91
Dec  5 07:57:35 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213932]: [NOTICE]   (213938) : path to executable is /usr/sbin/haproxy
Dec  5 07:57:35 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213932]: [WARNING]  (213938) : Exiting Master process...
Dec  5 07:57:35 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213932]: [WARNING]  (213938) : Exiting Master process...
Dec  5 07:57:35 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213932]: [ALERT]    (213938) : Current worker (213940) exited with code 143 (Terminated)
Dec  5 07:57:35 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[213932]: [WARNING]  (213938) : All workers exited. Exiting... (0)
Dec  5 07:57:35 np0005546954 systemd[1]: libpod-0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2.scope: Deactivated successfully.
Dec  5 07:57:35 np0005546954 podman[214377]: 2025-12-05 12:57:35.34530638 +0000 UTC m=+0.042547137 container died 0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec  5 07:57:35 np0005546954 NetworkManager[55665]: <info>  [1764939455.3620] manager: (tap2b67cffe-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.364 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.369 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:35 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2-userdata-shm.mount: Deactivated successfully.
Dec  5 07:57:35 np0005546954 systemd[1]: var-lib-containers-storage-overlay-0e7f9e6951f9b999b19879099be21895d424e854cf409f21b137312b3c5698dc-merged.mount: Deactivated successfully.
Dec  5 07:57:35 np0005546954 podman[214377]: 2025-12-05 12:57:35.382725697 +0000 UTC m=+0.079966454 container cleanup 0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  5 07:57:35 np0005546954 systemd[1]: libpod-conmon-0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2.scope: Deactivated successfully.
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.404 187164 INFO nova.virt.libvirt.driver [-] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Instance destroyed successfully.#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.405 187164 DEBUG nova.objects.instance [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'resources' on Instance uuid 199cbeb6-984a-4e32-8ae3-766207e89849 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.421 187164 DEBUG nova.virt.libvirt.vif [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:56:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1436754950',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1436754950',id=15,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:56:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-ds6vwl56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:57:28Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=199cbeb6-984a-4e32-8ae3-766207e89849,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6", "address": "fa:16:3e:f8:bd:3b", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b67cffe-9f", "ovs_interfaceid": "2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.422 187164 DEBUG nova.network.os_vif_util [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6", "address": "fa:16:3e:f8:bd:3b", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b67cffe-9f", "ovs_interfaceid": "2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.423 187164 DEBUG nova.network.os_vif_util [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f8:bd:3b,bridge_name='br-int',has_traffic_filtering=True,id=2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b67cffe-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.423 187164 DEBUG os_vif [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:bd:3b,bridge_name='br-int',has_traffic_filtering=True,id=2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b67cffe-9f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.425 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.426 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b67cffe-9f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.428 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.429 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.431 187164 INFO os_vif [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:bd:3b,bridge_name='br-int',has_traffic_filtering=True,id=2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b67cffe-9f')#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.431 187164 INFO nova.virt.libvirt.driver [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Deleting instance files /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849_del#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.432 187164 INFO nova.virt.libvirt.driver [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Deletion of /var/lib/nova/instances/199cbeb6-984a-4e32-8ae3-766207e89849_del complete#033[00m
Dec  5 07:57:35 np0005546954 podman[214431]: 2025-12-05 12:57:35.471409472 +0000 UTC m=+0.059020401 container remove 0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.474 187164 INFO nova.compute.manager [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.475 187164 DEBUG oslo.service.loopingcall [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.476 187164 DEBUG nova.compute.manager [-] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.476 187164 DEBUG nova.network.neutron [-] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.481 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[63461135-d347-46f7-88b4-fd8e4709a22c]: (4, ('Fri Dec  5 12:57:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2)\n0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2\nFri Dec  5 12:57:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2)\n0763f417ec2d2b03742a771bcb9e674c43437e01bafa3936d501bccf48ab32e2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.484 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4263235d-dbcd-466c-815d-90180d891b74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.485 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.486 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:35 np0005546954 kernel: tapd4389bc8-20: left promiscuous mode
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.488 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.493 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[741c38cc-2d05-430b-a1d6-74b066a9acc3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.499 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.519 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd0abe0-4d07-46b0-819a-cd86c4a1ad33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.521 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c7b98818-c421-42f4-ace6-ea1f15da57ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.536 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[06793bad-02ef-4160-b1e8-344f00845cc1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439386, 'reachable_time': 26620, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214449, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.541 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:57:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:57:35.541 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[3a5f775a-cc93-4f0c-9150-4f913842967b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:57:35 np0005546954 systemd[1]: run-netns-ovnmeta\x2dd4389bc8\x2d2898\x2d48b0\x2d9741\x2d5183b54fe83c.mount: Deactivated successfully.
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.642 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:35 np0005546954 podman[197513]: time="2025-12-05T12:57:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:57:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:57:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:57:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:57:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.905 187164 DEBUG nova.compute.manager [req-7499cdd6-a0fe-48ff-bb20-50d773d13c1b req-245a238e-6529-4c56-bd2d-9ba175775558 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Received event network-vif-unplugged-2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.905 187164 DEBUG oslo_concurrency.lockutils [req-7499cdd6-a0fe-48ff-bb20-50d773d13c1b req-245a238e-6529-4c56-bd2d-9ba175775558 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "199cbeb6-984a-4e32-8ae3-766207e89849-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.906 187164 DEBUG oslo_concurrency.lockutils [req-7499cdd6-a0fe-48ff-bb20-50d773d13c1b req-245a238e-6529-4c56-bd2d-9ba175775558 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "199cbeb6-984a-4e32-8ae3-766207e89849-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.906 187164 DEBUG oslo_concurrency.lockutils [req-7499cdd6-a0fe-48ff-bb20-50d773d13c1b req-245a238e-6529-4c56-bd2d-9ba175775558 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "199cbeb6-984a-4e32-8ae3-766207e89849-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.907 187164 DEBUG nova.compute.manager [req-7499cdd6-a0fe-48ff-bb20-50d773d13c1b req-245a238e-6529-4c56-bd2d-9ba175775558 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] No waiting events found dispatching network-vif-unplugged-2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.907 187164 DEBUG nova.compute.manager [req-7499cdd6-a0fe-48ff-bb20-50d773d13c1b req-245a238e-6529-4c56-bd2d-9ba175775558 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Received event network-vif-unplugged-2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.961 187164 DEBUG nova.compute.manager [req-206210e9-b7e3-40d8-b11d-b03ff69fca33 req-aed656a3-46a9-43af-bd34-85ab3a176f39 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Received event network-vif-plugged-74c7c063-9c46-4bde-8e16-cab6f4e2e23c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.962 187164 DEBUG oslo_concurrency.lockutils [req-206210e9-b7e3-40d8-b11d-b03ff69fca33 req-aed656a3-46a9-43af-bd34-85ab3a176f39 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.962 187164 DEBUG oslo_concurrency.lockutils [req-206210e9-b7e3-40d8-b11d-b03ff69fca33 req-aed656a3-46a9-43af-bd34-85ab3a176f39 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.963 187164 DEBUG oslo_concurrency.lockutils [req-206210e9-b7e3-40d8-b11d-b03ff69fca33 req-aed656a3-46a9-43af-bd34-85ab3a176f39 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4e3508c1-ecaa-442a-95c5-f5095e12912e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.963 187164 DEBUG nova.compute.manager [req-206210e9-b7e3-40d8-b11d-b03ff69fca33 req-aed656a3-46a9-43af-bd34-85ab3a176f39 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] No waiting events found dispatching network-vif-plugged-74c7c063-9c46-4bde-8e16-cab6f4e2e23c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.963 187164 WARNING nova.compute.manager [req-206210e9-b7e3-40d8-b11d-b03ff69fca33 req-aed656a3-46a9-43af-bd34-85ab3a176f39 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Received unexpected event network-vif-plugged-74c7c063-9c46-4bde-8e16-cab6f4e2e23c for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:57:35 np0005546954 nova_compute[187160]: 2025-12-05 12:57:35.964 187164 DEBUG nova.compute.manager [req-206210e9-b7e3-40d8-b11d-b03ff69fca33 req-aed656a3-46a9-43af-bd34-85ab3a176f39 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Received event network-vif-deleted-74c7c063-9c46-4bde-8e16-cab6f4e2e23c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:57:36 np0005546954 nova_compute[187160]: 2025-12-05 12:57:36.061 187164 DEBUG nova.network.neutron [-] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:57:36 np0005546954 nova_compute[187160]: 2025-12-05 12:57:36.081 187164 INFO nova.compute.manager [-] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Took 0.61 seconds to deallocate network for instance.#033[00m
Dec  5 07:57:36 np0005546954 nova_compute[187160]: 2025-12-05 12:57:36.216 187164 DEBUG oslo_concurrency.lockutils [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:36 np0005546954 nova_compute[187160]: 2025-12-05 12:57:36.217 187164 DEBUG oslo_concurrency.lockutils [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:36 np0005546954 nova_compute[187160]: 2025-12-05 12:57:36.223 187164 DEBUG oslo_concurrency.lockutils [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:36 np0005546954 nova_compute[187160]: 2025-12-05 12:57:36.622 187164 INFO nova.scheduler.client.report [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Deleted allocations for instance 199cbeb6-984a-4e32-8ae3-766207e89849#033[00m
Dec  5 07:57:36 np0005546954 nova_compute[187160]: 2025-12-05 12:57:36.705 187164 DEBUG oslo_concurrency.lockutils [None req-2a67a345-64e8-4e94-a94b-091929ac203a 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "199cbeb6-984a-4e32-8ae3-766207e89849" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:37 np0005546954 nova_compute[187160]: 2025-12-05 12:57:37.991 187164 DEBUG nova.compute.manager [req-243d8731-4c04-405f-bdc5-32ad3ffb0d33 req-0fc9c174-2c98-490d-a9d1-8aa020a648bc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Received event network-vif-plugged-2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:57:37 np0005546954 nova_compute[187160]: 2025-12-05 12:57:37.991 187164 DEBUG oslo_concurrency.lockutils [req-243d8731-4c04-405f-bdc5-32ad3ffb0d33 req-0fc9c174-2c98-490d-a9d1-8aa020a648bc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "199cbeb6-984a-4e32-8ae3-766207e89849-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:57:37 np0005546954 nova_compute[187160]: 2025-12-05 12:57:37.992 187164 DEBUG oslo_concurrency.lockutils [req-243d8731-4c04-405f-bdc5-32ad3ffb0d33 req-0fc9c174-2c98-490d-a9d1-8aa020a648bc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "199cbeb6-984a-4e32-8ae3-766207e89849-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:57:37 np0005546954 nova_compute[187160]: 2025-12-05 12:57:37.992 187164 DEBUG oslo_concurrency.lockutils [req-243d8731-4c04-405f-bdc5-32ad3ffb0d33 req-0fc9c174-2c98-490d-a9d1-8aa020a648bc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "199cbeb6-984a-4e32-8ae3-766207e89849-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:57:37 np0005546954 nova_compute[187160]: 2025-12-05 12:57:37.993 187164 DEBUG nova.compute.manager [req-243d8731-4c04-405f-bdc5-32ad3ffb0d33 req-0fc9c174-2c98-490d-a9d1-8aa020a648bc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] No waiting events found dispatching network-vif-plugged-2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:57:37 np0005546954 nova_compute[187160]: 2025-12-05 12:57:37.993 187164 WARNING nova.compute.manager [req-243d8731-4c04-405f-bdc5-32ad3ffb0d33 req-0fc9c174-2c98-490d-a9d1-8aa020a648bc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Received unexpected event network-vif-plugged-2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:57:38 np0005546954 nova_compute[187160]: 2025-12-05 12:57:38.037 187164 DEBUG nova.compute.manager [req-686a01ab-4fda-4bb5-b1d7-bd8efc2bea42 req-7c9e6453-a36f-4a91-be61-891ce326b5ae 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Received event network-vif-deleted-2b67cffe-9fbb-422c-86d7-1d5a1b3eabe6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:57:40 np0005546954 nova_compute[187160]: 2025-12-05 12:57:40.430 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:40 np0005546954 nova_compute[187160]: 2025-12-05 12:57:40.644 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:45 np0005546954 nova_compute[187160]: 2025-12-05 12:57:45.432 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:45 np0005546954 nova_compute[187160]: 2025-12-05 12:57:45.646 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:48 np0005546954 nova_compute[187160]: 2025-12-05 12:57:48.391 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939453.3895285, 4e3508c1-ecaa-442a-95c5-f5095e12912e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:57:48 np0005546954 nova_compute[187160]: 2025-12-05 12:57:48.391 187164 INFO nova.compute.manager [-] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:57:48 np0005546954 nova_compute[187160]: 2025-12-05 12:57:48.457 187164 DEBUG nova.compute.manager [None req-9c197bfd-f07a-406f-8e59-1f20ef41261f - - - - - -] [instance: 4e3508c1-ecaa-442a-95c5-f5095e12912e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:57:48 np0005546954 podman[214451]: 2025-12-05 12:57:48.566947981 +0000 UTC m=+0.073352248 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64)
Dec  5 07:57:48 np0005546954 podman[214452]: 2025-12-05 12:57:48.600118885 +0000 UTC m=+0.101925138 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:57:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:57:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:57:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:57:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:57:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:57:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:57:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:57:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:57:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:57:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:57:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:57:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:57:50 np0005546954 nova_compute[187160]: 2025-12-05 12:57:50.402 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939455.4012856, 199cbeb6-984a-4e32-8ae3-766207e89849 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:57:50 np0005546954 nova_compute[187160]: 2025-12-05 12:57:50.403 187164 INFO nova.compute.manager [-] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:57:50 np0005546954 nova_compute[187160]: 2025-12-05 12:57:50.435 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:50 np0005546954 nova_compute[187160]: 2025-12-05 12:57:50.648 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:50 np0005546954 nova_compute[187160]: 2025-12-05 12:57:50.759 187164 DEBUG nova.compute.manager [None req-f1639f63-391d-430e-8316-b99f950dd3e2 - - - - - -] [instance: 199cbeb6-984a-4e32-8ae3-766207e89849] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:57:55 np0005546954 nova_compute[187160]: 2025-12-05 12:57:55.479 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:57:55 np0005546954 nova_compute[187160]: 2025-12-05 12:57:55.651 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:00 np0005546954 nova_compute[187160]: 2025-12-05 12:58:00.482 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:00 np0005546954 podman[214494]: 2025-12-05 12:58:00.527829117 +0000 UTC m=+0.046047467 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  5 07:58:00 np0005546954 nova_compute[187160]: 2025-12-05 12:58:00.652 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:02 np0005546954 nova_compute[187160]: 2025-12-05 12:58:02.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:58:02 np0005546954 nova_compute[187160]: 2025-12-05 12:58:02.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:58:02 np0005546954 nova_compute[187160]: 2025-12-05 12:58:02.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:58:02 np0005546954 nova_compute[187160]: 2025-12-05 12:58:02.283 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:58:03 np0005546954 nova_compute[187160]: 2025-12-05 12:58:03.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:58:03 np0005546954 nova_compute[187160]: 2025-12-05 12:58:03.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:58:05 np0005546954 nova_compute[187160]: 2025-12-05 12:58:05.484 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:05 np0005546954 podman[214514]: 2025-12-05 12:58:05.540694376 +0000 UTC m=+0.051254458 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:58:05 np0005546954 podman[214513]: 2025-12-05 12:58:05.571749194 +0000 UTC m=+0.088764808 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 07:58:05 np0005546954 podman[197513]: time="2025-12-05T12:58:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:58:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:58:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:58:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:58:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2594 "" "Go-http-client/1.1"
Dec  5 07:58:05 np0005546954 nova_compute[187160]: 2025-12-05 12:58:05.654 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:06 np0005546954 nova_compute[187160]: 2025-12-05 12:58:06.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.037 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.060 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.060 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.061 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.061 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.222 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.224 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5851MB free_disk=73.3361930847168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.224 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.224 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.303 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.303 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.327 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.343 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.372 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:58:09 np0005546954 nova_compute[187160]: 2025-12-05 12:58:09.372 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:58:10 np0005546954 nova_compute[187160]: 2025-12-05 12:58:10.491 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:10 np0005546954 nova_compute[187160]: 2025-12-05 12:58:10.656 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:12 np0005546954 nova_compute[187160]: 2025-12-05 12:58:12.373 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:58:14 np0005546954 nova_compute[187160]: 2025-12-05 12:58:14.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:58:14 np0005546954 nova_compute[187160]: 2025-12-05 12:58:14.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:58:15 np0005546954 nova_compute[187160]: 2025-12-05 12:58:15.496 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:15 np0005546954 nova_compute[187160]: 2025-12-05 12:58:15.657 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:16 np0005546954 nova_compute[187160]: 2025-12-05 12:58:16.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:58:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:16.958 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:58:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:16.959 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:58:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:16.959 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.007 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "4287ea95-6b7c-4583-ba8c-a9fdca606587" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.008 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.031 187164 DEBUG nova.compute.manager [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.107 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.108 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.116 187164 DEBUG nova.virt.hardware [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.116 187164 INFO nova.compute.claims [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.224 187164 DEBUG nova.compute.provider_tree [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.245 187164 DEBUG nova.scheduler.client.report [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.277 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.278 187164 DEBUG nova.compute.manager [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.327 187164 DEBUG nova.compute.manager [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.329 187164 DEBUG nova.network.neutron [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.352 187164 INFO nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.372 187164 DEBUG nova.compute.manager [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.452 187164 DEBUG nova.compute.manager [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.454 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.455 187164 INFO nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Creating image(s)#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.456 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "/var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.457 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.458 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.483 187164 DEBUG oslo_concurrency.processutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.577 187164 DEBUG oslo_concurrency.processutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.579 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.580 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.605 187164 DEBUG oslo_concurrency.processutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.690 187164 DEBUG oslo_concurrency.processutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.692 187164 DEBUG oslo_concurrency.processutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.731 187164 DEBUG nova.policy [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ae0bb20ac8b4be99eb1abddc7310436', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.736 187164 DEBUG oslo_concurrency.processutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.737 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.737 187164 DEBUG oslo_concurrency.processutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.793 187164 DEBUG oslo_concurrency.processutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.794 187164 DEBUG nova.virt.disk.api [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Checking if we can resize image /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.795 187164 DEBUG oslo_concurrency.processutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.867 187164 DEBUG oslo_concurrency.processutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.868 187164 DEBUG nova.virt.disk.api [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Cannot resize image /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.869 187164 DEBUG nova.objects.instance [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'migration_context' on Instance uuid 4287ea95-6b7c-4583-ba8c-a9fdca606587 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.884 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.884 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Ensure instance console log exists: /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.885 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.885 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:58:18 np0005546954 nova_compute[187160]: 2025-12-05 12:58:18.886 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:58:19 np0005546954 podman[214579]: 2025-12-05 12:58:19.300810321 +0000 UTC m=+0.060672042 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Dec  5 07:58:19 np0005546954 podman[214580]: 2025-12-05 12:58:19.329908339 +0000 UTC m=+0.084224897 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:58:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:58:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:58:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:58:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:58:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:58:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:58:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:58:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:58:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:58:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:58:19 np0005546954 nova_compute[187160]: 2025-12-05 12:58:19.702 187164 DEBUG nova.network.neutron [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Successfully created port: 387def1f-1379-4223-bac4-15b131c92566 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:58:20 np0005546954 nova_compute[187160]: 2025-12-05 12:58:20.499 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:20 np0005546954 nova_compute[187160]: 2025-12-05 12:58:20.659 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:20 np0005546954 nova_compute[187160]: 2025-12-05 12:58:20.857 187164 DEBUG nova.network.neutron [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Successfully updated port: 387def1f-1379-4223-bac4-15b131c92566 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:58:20 np0005546954 nova_compute[187160]: 2025-12-05 12:58:20.877 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "refresh_cache-4287ea95-6b7c-4583-ba8c-a9fdca606587" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:58:20 np0005546954 nova_compute[187160]: 2025-12-05 12:58:20.877 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquired lock "refresh_cache-4287ea95-6b7c-4583-ba8c-a9fdca606587" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:58:20 np0005546954 nova_compute[187160]: 2025-12-05 12:58:20.878 187164 DEBUG nova.network.neutron [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:58:20 np0005546954 nova_compute[187160]: 2025-12-05 12:58:20.944 187164 DEBUG nova.compute.manager [req-8b37ec82-e700-4cf0-8541-e7ae241417b1 req-4519fecf-4ffa-4e9e-867a-a91acf32ec21 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Received event network-changed-387def1f-1379-4223-bac4-15b131c92566 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:58:20 np0005546954 nova_compute[187160]: 2025-12-05 12:58:20.945 187164 DEBUG nova.compute.manager [req-8b37ec82-e700-4cf0-8541-e7ae241417b1 req-4519fecf-4ffa-4e9e-867a-a91acf32ec21 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Refreshing instance network info cache due to event network-changed-387def1f-1379-4223-bac4-15b131c92566. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:58:20 np0005546954 nova_compute[187160]: 2025-12-05 12:58:20.945 187164 DEBUG oslo_concurrency.lockutils [req-8b37ec82-e700-4cf0-8541-e7ae241417b1 req-4519fecf-4ffa-4e9e-867a-a91acf32ec21 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-4287ea95-6b7c-4583-ba8c-a9fdca606587" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:58:21 np0005546954 nova_compute[187160]: 2025-12-05 12:58:21.054 187164 DEBUG nova.network.neutron [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.046 187164 DEBUG nova.network.neutron [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Updating instance_info_cache with network_info: [{"id": "387def1f-1379-4223-bac4-15b131c92566", "address": "fa:16:3e:2f:77:70", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap387def1f-13", "ovs_interfaceid": "387def1f-1379-4223-bac4-15b131c92566", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.071 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Releasing lock "refresh_cache-4287ea95-6b7c-4583-ba8c-a9fdca606587" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.071 187164 DEBUG nova.compute.manager [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Instance network_info: |[{"id": "387def1f-1379-4223-bac4-15b131c92566", "address": "fa:16:3e:2f:77:70", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap387def1f-13", "ovs_interfaceid": "387def1f-1379-4223-bac4-15b131c92566", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.072 187164 DEBUG oslo_concurrency.lockutils [req-8b37ec82-e700-4cf0-8541-e7ae241417b1 req-4519fecf-4ffa-4e9e-867a-a91acf32ec21 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-4287ea95-6b7c-4583-ba8c-a9fdca606587" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.072 187164 DEBUG nova.network.neutron [req-8b37ec82-e700-4cf0-8541-e7ae241417b1 req-4519fecf-4ffa-4e9e-867a-a91acf32ec21 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Refreshing network info cache for port 387def1f-1379-4223-bac4-15b131c92566 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.075 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Start _get_guest_xml network_info=[{"id": "387def1f-1379-4223-bac4-15b131c92566", "address": "fa:16:3e:2f:77:70", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap387def1f-13", "ovs_interfaceid": "387def1f-1379-4223-bac4-15b131c92566", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.078 187164 WARNING nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.082 187164 DEBUG nova.virt.libvirt.host [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.083 187164 DEBUG nova.virt.libvirt.host [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.090 187164 DEBUG nova.virt.libvirt.host [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.090 187164 DEBUG nova.virt.libvirt.host [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.092 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.093 187164 DEBUG nova.virt.hardware [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.093 187164 DEBUG nova.virt.hardware [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.093 187164 DEBUG nova.virt.hardware [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.094 187164 DEBUG nova.virt.hardware [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.094 187164 DEBUG nova.virt.hardware [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.094 187164 DEBUG nova.virt.hardware [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.095 187164 DEBUG nova.virt.hardware [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.095 187164 DEBUG nova.virt.hardware [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.096 187164 DEBUG nova.virt.hardware [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.096 187164 DEBUG nova.virt.hardware [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.096 187164 DEBUG nova.virt.hardware [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.100 187164 DEBUG nova.virt.libvirt.vif [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:58:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-252433423',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-252433423',id=18,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-gihsj0d3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:58:18Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=4287ea95-6b7c-4583-ba8c-a9fdca606587,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "387def1f-1379-4223-bac4-15b131c92566", "address": "fa:16:3e:2f:77:70", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap387def1f-13", "ovs_interfaceid": "387def1f-1379-4223-bac4-15b131c92566", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.101 187164 DEBUG nova.network.os_vif_util [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "387def1f-1379-4223-bac4-15b131c92566", "address": "fa:16:3e:2f:77:70", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap387def1f-13", "ovs_interfaceid": "387def1f-1379-4223-bac4-15b131c92566", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.102 187164 DEBUG nova.network.os_vif_util [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:77:70,bridge_name='br-int',has_traffic_filtering=True,id=387def1f-1379-4223-bac4-15b131c92566,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap387def1f-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.103 187164 DEBUG nova.objects.instance [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4287ea95-6b7c-4583-ba8c-a9fdca606587 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.116 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  <uuid>4287ea95-6b7c-4583-ba8c-a9fdca606587</uuid>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  <name>instance-00000012</name>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteStrategies-server-252433423</nova:name>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 12:58:22</nova:creationTime>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:        <nova:user uuid="0ae0bb20ac8b4be99eb1abddc7310436">tempest-TestExecuteStrategies-192029678-project-member</nova:user>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:        <nova:project uuid="e6ae0d0dcde04b85b6dae45560cca988">tempest-TestExecuteStrategies-192029678</nova:project>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:        <nova:port uuid="387def1f-1379-4223-bac4-15b131c92566">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <system>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <entry name="serial">4287ea95-6b7c-4583-ba8c-a9fdca606587</entry>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <entry name="uuid">4287ea95-6b7c-4583-ba8c-a9fdca606587</entry>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    </system>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  <os>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  </os>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  <features>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  </features>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  </clock>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  <devices>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk.config"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    </disk>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:2f:77:70"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <target dev="tap387def1f-13"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    </interface>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/console.log" append="off"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    </serial>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <video>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    </video>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    </rng>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 07:58:22 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 07:58:22 np0005546954 nova_compute[187160]:  </devices>
Dec  5 07:58:22 np0005546954 nova_compute[187160]: </domain>
Dec  5 07:58:22 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.118 187164 DEBUG nova.compute.manager [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Preparing to wait for external event network-vif-plugged-387def1f-1379-4223-bac4-15b131c92566 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.118 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.118 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.119 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.119 187164 DEBUG nova.virt.libvirt.vif [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:58:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-252433423',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-252433423',id=18,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-gihsj0d3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:58:18Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=4287ea95-6b7c-4583-ba8c-a9fdca606587,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "387def1f-1379-4223-bac4-15b131c92566", "address": "fa:16:3e:2f:77:70", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap387def1f-13", "ovs_interfaceid": "387def1f-1379-4223-bac4-15b131c92566", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.120 187164 DEBUG nova.network.os_vif_util [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "387def1f-1379-4223-bac4-15b131c92566", "address": "fa:16:3e:2f:77:70", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap387def1f-13", "ovs_interfaceid": "387def1f-1379-4223-bac4-15b131c92566", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.120 187164 DEBUG nova.network.os_vif_util [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:77:70,bridge_name='br-int',has_traffic_filtering=True,id=387def1f-1379-4223-bac4-15b131c92566,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap387def1f-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.121 187164 DEBUG os_vif [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:77:70,bridge_name='br-int',has_traffic_filtering=True,id=387def1f-1379-4223-bac4-15b131c92566,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap387def1f-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.121 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.122 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.122 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.125 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.125 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap387def1f-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.125 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap387def1f-13, col_values=(('external_ids', {'iface-id': '387def1f-1379-4223-bac4-15b131c92566', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2f:77:70', 'vm-uuid': '4287ea95-6b7c-4583-ba8c-a9fdca606587'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.127 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:22 np0005546954 NetworkManager[55665]: <info>  [1764939502.1283] manager: (tap387def1f-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.129 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.133 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.134 187164 INFO os_vif [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:77:70,bridge_name='br-int',has_traffic_filtering=True,id=387def1f-1379-4223-bac4-15b131c92566,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap387def1f-13')#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.183 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.184 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.184 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No VIF found with MAC fa:16:3e:2f:77:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.185 187164 INFO nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Using config drive#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.505 187164 INFO nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Creating config drive at /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk.config#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.511 187164 DEBUG oslo_concurrency.processutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb1jaaf_0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.651 187164 DEBUG oslo_concurrency.processutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb1jaaf_0" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:58:22 np0005546954 kernel: tap387def1f-13: entered promiscuous mode
Dec  5 07:58:22 np0005546954 NetworkManager[55665]: <info>  [1764939502.7065] manager: (tap387def1f-13): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Dec  5 07:58:22 np0005546954 ovn_controller[95566]: 2025-12-05T12:58:22Z|00170|binding|INFO|Claiming lport 387def1f-1379-4223-bac4-15b131c92566 for this chassis.
Dec  5 07:58:22 np0005546954 ovn_controller[95566]: 2025-12-05T12:58:22Z|00171|binding|INFO|387def1f-1379-4223-bac4-15b131c92566: Claiming fa:16:3e:2f:77:70 10.100.0.13
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.707 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.714 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:77:70 10.100.0.13'], port_security=['fa:16:3e:2f:77:70 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4287ea95-6b7c-4583-ba8c-a9fdca606587', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=387def1f-1379-4223-bac4-15b131c92566) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.716 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 387def1f-1379-4223-bac4-15b131c92566 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.717 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:58:22 np0005546954 ovn_controller[95566]: 2025-12-05T12:58:22Z|00172|binding|INFO|Setting lport 387def1f-1379-4223-bac4-15b131c92566 ovn-installed in OVS
Dec  5 07:58:22 np0005546954 ovn_controller[95566]: 2025-12-05T12:58:22Z|00173|binding|INFO|Setting lport 387def1f-1379-4223-bac4-15b131c92566 up in Southbound
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.721 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:22 np0005546954 nova_compute[187160]: 2025-12-05 12:58:22.723 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.732 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5e172d80-0397-449c-8f90-2f08c5581f8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.733 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd4389bc8-21 in ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.735 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd4389bc8-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.735 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ecbbdd4c-3b82-4470-9413-f235b74f7ecb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.736 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f8aaea59-5f81-4abd-b44b-424c0349ce5f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 systemd-udevd[214637]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:58:22 np0005546954 systemd-machined[153497]: New machine qemu-16-instance-00000012.
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.747 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[ec05c259-76ea-4d70-a385-67f674dcfd1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 NetworkManager[55665]: <info>  [1764939502.7500] device (tap387def1f-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:58:22 np0005546954 NetworkManager[55665]: <info>  [1764939502.7514] device (tap387def1f-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:58:22 np0005546954 systemd[1]: Started Virtual Machine qemu-16-instance-00000012.
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.773 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d9990e22-effc-4751-a86c-76c21372cecc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.799 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[192001cb-5591-453e-9603-89b116e60643]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.804 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[adad2234-5e70-4251-96ad-b5155fab39c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 systemd-udevd[214641]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:58:22 np0005546954 NetworkManager[55665]: <info>  [1764939502.8064] manager: (tapd4389bc8-20): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.834 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[10e09af8-51ac-499a-aa78-74b96ef4d0f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.837 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[df2efb4d-c7e0-4e3c-a82e-4b53a9324853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 NetworkManager[55665]: <info>  [1764939502.8568] device (tapd4389bc8-20): carrier: link connected
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.861 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[1deb3597-8cf1-468d-ac3f-03fa51638386]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.875 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d289384e-df8c-4002-a7ff-6708fa80c6d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450948, 'reachable_time': 39152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214670, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.892 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[52ba05c4-070f-48ec-bee7-bb45a11662d2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:43f7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450948, 'tstamp': 450948}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214671, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.911 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4984a6b7-2bf7-4d51-b640-ce7d27df4bbf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450948, 'reachable_time': 39152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214672, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:22.942 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7036164e-9576-4f52-b2f1-bea857655974]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:23.008 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[2cf99619-9c7e-40b1-a4ab-c6426f62afdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:23.009 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:23.010 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:23.010 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:58:23 np0005546954 kernel: tapd4389bc8-20: entered promiscuous mode
Dec  5 07:58:23 np0005546954 NetworkManager[55665]: <info>  [1764939503.0129] manager: (tapd4389bc8-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.012 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:23.019 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:58:23 np0005546954 ovn_controller[95566]: 2025-12-05T12:58:23Z|00174|binding|INFO|Releasing lport 8dbe2af5-9f18-44ca-8f22-66854bcdd596 from this chassis (sb_readonly=0)
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.021 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.031 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:23.033 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:23.034 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[8f7ba586-44a6-4175-9153-bc9917b6a083]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:23.035 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:58:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:58:23.036 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'env', 'PROCESS_TAG=haproxy-d4389bc8-2898-48b0-9741-5183b54fe83c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d4389bc8-2898-48b0-9741-5183b54fe83c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.042 187164 DEBUG nova.compute.manager [req-5604e8ae-04f1-4f1d-899f-2bf415f1f0b5 req-af3f9308-38c4-49c9-aa51-4eb1ca72a83c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Received event network-vif-plugged-387def1f-1379-4223-bac4-15b131c92566 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.043 187164 DEBUG oslo_concurrency.lockutils [req-5604e8ae-04f1-4f1d-899f-2bf415f1f0b5 req-af3f9308-38c4-49c9-aa51-4eb1ca72a83c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.043 187164 DEBUG oslo_concurrency.lockutils [req-5604e8ae-04f1-4f1d-899f-2bf415f1f0b5 req-af3f9308-38c4-49c9-aa51-4eb1ca72a83c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.043 187164 DEBUG oslo_concurrency.lockutils [req-5604e8ae-04f1-4f1d-899f-2bf415f1f0b5 req-af3f9308-38c4-49c9-aa51-4eb1ca72a83c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.044 187164 DEBUG nova.compute.manager [req-5604e8ae-04f1-4f1d-899f-2bf415f1f0b5 req-af3f9308-38c4-49c9-aa51-4eb1ca72a83c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Processing event network-vif-plugged-387def1f-1379-4223-bac4-15b131c92566 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.131 187164 DEBUG nova.network.neutron [req-8b37ec82-e700-4cf0-8541-e7ae241417b1 req-4519fecf-4ffa-4e9e-867a-a91acf32ec21 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Updated VIF entry in instance network info cache for port 387def1f-1379-4223-bac4-15b131c92566. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.132 187164 DEBUG nova.network.neutron [req-8b37ec82-e700-4cf0-8541-e7ae241417b1 req-4519fecf-4ffa-4e9e-867a-a91acf32ec21 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Updating instance_info_cache with network_info: [{"id": "387def1f-1379-4223-bac4-15b131c92566", "address": "fa:16:3e:2f:77:70", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap387def1f-13", "ovs_interfaceid": "387def1f-1379-4223-bac4-15b131c92566", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.191 187164 DEBUG oslo_concurrency.lockutils [req-8b37ec82-e700-4cf0-8541-e7ae241417b1 req-4519fecf-4ffa-4e9e-867a-a91acf32ec21 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-4287ea95-6b7c-4583-ba8c-a9fdca606587" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.305 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939503.3053799, 4287ea95-6b7c-4583-ba8c-a9fdca606587 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.306 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] VM Started (Lifecycle Event)#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.308 187164 DEBUG nova.compute.manager [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.313 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.316 187164 INFO nova.virt.libvirt.driver [-] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Instance spawned successfully.#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.316 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.322 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.325 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.333 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.333 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.333 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.334 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.334 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.334 187164 DEBUG nova.virt.libvirt.driver [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.344 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.344 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939503.3054788, 4287ea95-6b7c-4583-ba8c-a9fdca606587 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.344 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.372 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.375 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939503.3106256, 4287ea95-6b7c-4583-ba8c-a9fdca606587 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.375 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.390 187164 INFO nova.compute.manager [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Took 4.94 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.391 187164 DEBUG nova.compute.manager [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.392 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.396 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:58:23 np0005546954 podman[214709]: 2025-12-05 12:58:23.418914558 +0000 UTC m=+0.050931949 container create 232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.450 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:58:23 np0005546954 systemd[1]: Started libpod-conmon-232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550.scope.
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.481 187164 INFO nova.compute.manager [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Took 5.40 seconds to build instance.#033[00m
Dec  5 07:58:23 np0005546954 podman[214709]: 2025-12-05 12:58:23.389511812 +0000 UTC m=+0.021529223 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:58:23 np0005546954 systemd[1]: Started libcrun container.
Dec  5 07:58:23 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40d28398048d3be33914f9ee9d5fcbf16984d882fe15f83a1d3b5c090aa553cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:58:23 np0005546954 nova_compute[187160]: 2025-12-05 12:58:23.498 187164 DEBUG oslo_concurrency.lockutils [None req-7a75fe08-3cac-4715-8d96-3722bfb94a66 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.491s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:58:23 np0005546954 podman[214709]: 2025-12-05 12:58:23.504631731 +0000 UTC m=+0.136649132 container init 232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:58:23 np0005546954 podman[214709]: 2025-12-05 12:58:23.51262929 +0000 UTC m=+0.144646681 container start 232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec  5 07:58:23 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[214725]: [NOTICE]   (214729) : New worker (214731) forked
Dec  5 07:58:23 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[214725]: [NOTICE]   (214729) : Loading success.
Dec  5 07:58:25 np0005546954 nova_compute[187160]: 2025-12-05 12:58:25.117 187164 DEBUG nova.compute.manager [req-4839b176-180a-491d-bf83-a4d2d544c030 req-06f1d3a8-acc3-494a-931b-854c4f6c01dd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Received event network-vif-plugged-387def1f-1379-4223-bac4-15b131c92566 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:58:25 np0005546954 nova_compute[187160]: 2025-12-05 12:58:25.117 187164 DEBUG oslo_concurrency.lockutils [req-4839b176-180a-491d-bf83-a4d2d544c030 req-06f1d3a8-acc3-494a-931b-854c4f6c01dd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:58:25 np0005546954 nova_compute[187160]: 2025-12-05 12:58:25.118 187164 DEBUG oslo_concurrency.lockutils [req-4839b176-180a-491d-bf83-a4d2d544c030 req-06f1d3a8-acc3-494a-931b-854c4f6c01dd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:58:25 np0005546954 nova_compute[187160]: 2025-12-05 12:58:25.118 187164 DEBUG oslo_concurrency.lockutils [req-4839b176-180a-491d-bf83-a4d2d544c030 req-06f1d3a8-acc3-494a-931b-854c4f6c01dd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:58:25 np0005546954 nova_compute[187160]: 2025-12-05 12:58:25.118 187164 DEBUG nova.compute.manager [req-4839b176-180a-491d-bf83-a4d2d544c030 req-06f1d3a8-acc3-494a-931b-854c4f6c01dd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] No waiting events found dispatching network-vif-plugged-387def1f-1379-4223-bac4-15b131c92566 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:58:25 np0005546954 nova_compute[187160]: 2025-12-05 12:58:25.118 187164 WARNING nova.compute.manager [req-4839b176-180a-491d-bf83-a4d2d544c030 req-06f1d3a8-acc3-494a-931b-854c4f6c01dd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Received unexpected event network-vif-plugged-387def1f-1379-4223-bac4-15b131c92566 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:58:25 np0005546954 nova_compute[187160]: 2025-12-05 12:58:25.662 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:27 np0005546954 nova_compute[187160]: 2025-12-05 12:58:27.128 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:30 np0005546954 nova_compute[187160]: 2025-12-05 12:58:30.665 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:31 np0005546954 podman[214740]: 2025-12-05 12:58:31.565067275 +0000 UTC m=+0.070200589 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 07:58:32 np0005546954 nova_compute[187160]: 2025-12-05 12:58:32.130 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:34 np0005546954 ovn_controller[95566]: 2025-12-05T12:58:34Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2f:77:70 10.100.0.13
Dec  5 07:58:34 np0005546954 ovn_controller[95566]: 2025-12-05T12:58:34Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2f:77:70 10.100.0.13
Dec  5 07:58:35 np0005546954 podman[197513]: time="2025-12-05T12:58:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:58:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:58:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 07:58:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:58:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3060 "" "Go-http-client/1.1"
Dec  5 07:58:35 np0005546954 nova_compute[187160]: 2025-12-05 12:58:35.666 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:36 np0005546954 podman[214778]: 2025-12-05 12:58:36.561566496 +0000 UTC m=+0.064322436 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:58:36 np0005546954 podman[214777]: 2025-12-05 12:58:36.632354812 +0000 UTC m=+0.130413276 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:58:37 np0005546954 nova_compute[187160]: 2025-12-05 12:58:37.132 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:40 np0005546954 nova_compute[187160]: 2025-12-05 12:58:40.668 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:42 np0005546954 nova_compute[187160]: 2025-12-05 12:58:42.134 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:45 np0005546954 nova_compute[187160]: 2025-12-05 12:58:45.670 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:47 np0005546954 nova_compute[187160]: 2025-12-05 12:58:47.137 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:58:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:58:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:58:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:58:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:58:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:58:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:58:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:58:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:58:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:58:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:58:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:58:49 np0005546954 podman[214829]: 2025-12-05 12:58:49.561014328 +0000 UTC m=+0.069980843 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_id=edpm, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec  5 07:58:49 np0005546954 podman[214830]: 2025-12-05 12:58:49.574740496 +0000 UTC m=+0.067099043 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2)
Dec  5 07:58:50 np0005546954 nova_compute[187160]: 2025-12-05 12:58:50.672 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:52 np0005546954 nova_compute[187160]: 2025-12-05 12:58:52.140 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:52 np0005546954 ovn_controller[95566]: 2025-12-05T12:58:52Z|00175|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec  5 07:58:55 np0005546954 nova_compute[187160]: 2025-12-05 12:58:55.673 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:58:57 np0005546954 nova_compute[187160]: 2025-12-05 12:58:57.194 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:00 np0005546954 nova_compute[187160]: 2025-12-05 12:59:00.675 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:02 np0005546954 nova_compute[187160]: 2025-12-05 12:59:02.234 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:02 np0005546954 podman[214870]: 2025-12-05 12:59:02.577019758 +0000 UTC m=+0.078958133 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 07:59:03 np0005546954 nova_compute[187160]: 2025-12-05 12:59:03.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:59:04 np0005546954 nova_compute[187160]: 2025-12-05 12:59:04.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:59:04 np0005546954 nova_compute[187160]: 2025-12-05 12:59:04.038 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:59:04 np0005546954 nova_compute[187160]: 2025-12-05 12:59:04.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:59:04 np0005546954 nova_compute[187160]: 2025-12-05 12:59:04.713 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-4287ea95-6b7c-4583-ba8c-a9fdca606587" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:59:04 np0005546954 nova_compute[187160]: 2025-12-05 12:59:04.714 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-4287ea95-6b7c-4583-ba8c-a9fdca606587" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:59:04 np0005546954 nova_compute[187160]: 2025-12-05 12:59:04.714 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:59:04 np0005546954 nova_compute[187160]: 2025-12-05 12:59:04.714 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4287ea95-6b7c-4583-ba8c-a9fdca606587 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:59:05 np0005546954 podman[197513]: time="2025-12-05T12:59:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:59:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:59:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 07:59:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:59:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3049 "" "Go-http-client/1.1"
Dec  5 07:59:05 np0005546954 nova_compute[187160]: 2025-12-05 12:59:05.676 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:06 np0005546954 nova_compute[187160]: 2025-12-05 12:59:06.933 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Updating instance_info_cache with network_info: [{"id": "387def1f-1379-4223-bac4-15b131c92566", "address": "fa:16:3e:2f:77:70", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap387def1f-13", "ovs_interfaceid": "387def1f-1379-4223-bac4-15b131c92566", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:59:06 np0005546954 nova_compute[187160]: 2025-12-05 12:59:06.965 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-4287ea95-6b7c-4583-ba8c-a9fdca606587" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:59:06 np0005546954 nova_compute[187160]: 2025-12-05 12:59:06.965 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:59:06 np0005546954 nova_compute[187160]: 2025-12-05 12:59:06.966 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:59:07 np0005546954 nova_compute[187160]: 2025-12-05 12:59:07.289 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:07 np0005546954 podman[214890]: 2025-12-05 12:59:07.580363092 +0000 UTC m=+0.075127813 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:59:07 np0005546954 podman[214889]: 2025-12-05 12:59:07.618725537 +0000 UTC m=+0.118043531 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:59:08 np0005546954 nova_compute[187160]: 2025-12-05 12:59:08.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:59:09 np0005546954 nova_compute[187160]: 2025-12-05 12:59:09.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:59:09 np0005546954 nova_compute[187160]: 2025-12-05 12:59:09.037 187164 DEBUG nova.virt.libvirt.driver [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Creating tmpfile /var/lib/nova/instances/tmpn0vu7xsb to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  5 07:59:09 np0005546954 nova_compute[187160]: 2025-12-05 12:59:09.039 187164 DEBUG nova.compute.manager [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpn0vu7xsb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.134 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.134 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.134 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.135 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.229 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.286 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.288 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.343 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.510 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.512 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5717MB free_disk=73.30755233764648GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.512 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.512 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.587 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Migration for instance a3590805-f796-4ac1-9051-0976e21b76dd refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.603 187164 DEBUG nova.compute.manager [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpn0vu7xsb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a3590805-f796-4ac1-9051-0976e21b76dd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.623 187164 INFO nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Updating resource usage from migration 25ff8907-e16a-4270-9e1f-0be10efe3bcd#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.624 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Starting to track incoming migration 25ff8907-e16a-4270-9e1f-0be10efe3bcd with flavor b4ea63be-97f8-4a48-b000-66321c4ddb27 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.669 187164 DEBUG oslo_concurrency.lockutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-a3590805-f796-4ac1-9051-0976e21b76dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.670 187164 DEBUG oslo_concurrency.lockutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-a3590805-f796-4ac1-9051-0976e21b76dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.670 187164 DEBUG nova.network.neutron [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.678 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.697 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 4287ea95-6b7c-4583-ba8c-a9fdca606587 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.716 187164 WARNING nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance a3590805-f796-4ac1-9051-0976e21b76dd has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.716 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.716 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.732 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing inventories for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.751 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating ProviderTree inventory for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.751 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.773 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing aggregate associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.807 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing trait associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.855 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.871 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.888 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:59:10 np0005546954 nova_compute[187160]: 2025-12-05 12:59:10.888 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:59:11 np0005546954 nova_compute[187160]: 2025-12-05 12:59:11.805 187164 DEBUG nova.network.neutron [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Updating instance_info_cache with network_info: [{"id": "80b9305c-89c9-4475-8764-be7039f28636", "address": "fa:16:3e:b9:65:c0", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80b9305c-89", "ovs_interfaceid": "80b9305c-89c9-4475-8764-be7039f28636", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:59:11 np0005546954 nova_compute[187160]: 2025-12-05 12:59:11.822 187164 DEBUG oslo_concurrency.lockutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-a3590805-f796-4ac1-9051-0976e21b76dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:59:11 np0005546954 nova_compute[187160]: 2025-12-05 12:59:11.824 187164 DEBUG nova.virt.libvirt.driver [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpn0vu7xsb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a3590805-f796-4ac1-9051-0976e21b76dd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  5 07:59:11 np0005546954 nova_compute[187160]: 2025-12-05 12:59:11.824 187164 DEBUG nova.virt.libvirt.driver [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Creating instance directory: /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  5 07:59:11 np0005546954 nova_compute[187160]: 2025-12-05 12:59:11.825 187164 DEBUG nova.virt.libvirt.driver [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Creating disk.info with the contents: {'/var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk': 'qcow2', '/var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  5 07:59:11 np0005546954 nova_compute[187160]: 2025-12-05 12:59:11.825 187164 DEBUG nova.virt.libvirt.driver [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  5 07:59:11 np0005546954 nova_compute[187160]: 2025-12-05 12:59:11.826 187164 DEBUG nova.objects.instance [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'trusted_certs' on Instance uuid a3590805-f796-4ac1-9051-0976e21b76dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:59:11 np0005546954 nova_compute[187160]: 2025-12-05 12:59:11.853 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:59:11 np0005546954 nova_compute[187160]: 2025-12-05 12:59:11.941 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:59:11 np0005546954 nova_compute[187160]: 2025-12-05 12:59:11.942 187164 DEBUG oslo_concurrency.lockutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:59:11 np0005546954 nova_compute[187160]: 2025-12-05 12:59:11.943 187164 DEBUG oslo_concurrency.lockutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:59:11 np0005546954 nova_compute[187160]: 2025-12-05 12:59:11.958 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.052 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.054 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.097 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.098 187164 DEBUG oslo_concurrency.lockutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.099 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.171 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.173 187164 DEBUG nova.virt.disk.api [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Checking if we can resize image /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.174 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.244 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.246 187164 DEBUG nova.virt.disk.api [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Cannot resize image /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.247 187164 DEBUG nova.objects.instance [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'migration_context' on Instance uuid a3590805-f796-4ac1-9051-0976e21b76dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.261 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.287 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk.config 485376" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.288 187164 DEBUG nova.virt.libvirt.volume.remotefs [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk.config to /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.288 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk.config /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.327 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.802 187164 DEBUG oslo_concurrency.processutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd/disk.config /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.803 187164 DEBUG nova.virt.libvirt.driver [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.805 187164 DEBUG nova.virt.libvirt.vif [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1601943028',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1601943028',id=17,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:58:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-8ld7dxoq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:58:10Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=a3590805-f796-4ac1-9051-0976e21b76dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "80b9305c-89c9-4475-8764-be7039f28636", "address": "fa:16:3e:b9:65:c0", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap80b9305c-89", "ovs_interfaceid": "80b9305c-89c9-4475-8764-be7039f28636", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.806 187164 DEBUG nova.network.os_vif_util [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "80b9305c-89c9-4475-8764-be7039f28636", "address": "fa:16:3e:b9:65:c0", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap80b9305c-89", "ovs_interfaceid": "80b9305c-89c9-4475-8764-be7039f28636", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.807 187164 DEBUG nova.network.os_vif_util [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:65:c0,bridge_name='br-int',has_traffic_filtering=True,id=80b9305c-89c9-4475-8764-be7039f28636,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80b9305c-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.808 187164 DEBUG os_vif [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:65:c0,bridge_name='br-int',has_traffic_filtering=True,id=80b9305c-89c9-4475-8764-be7039f28636,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80b9305c-89') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.809 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.809 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.810 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.815 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.816 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap80b9305c-89, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.817 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap80b9305c-89, col_values=(('external_ids', {'iface-id': '80b9305c-89c9-4475-8764-be7039f28636', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:65:c0', 'vm-uuid': 'a3590805-f796-4ac1-9051-0976e21b76dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.818 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:12 np0005546954 NetworkManager[55665]: <info>  [1764939552.8197] manager: (tap80b9305c-89): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.823 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.826 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.829 187164 INFO os_vif [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:65:c0,bridge_name='br-int',has_traffic_filtering=True,id=80b9305c-89c9-4475-8764-be7039f28636,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80b9305c-89')#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.830 187164 DEBUG nova.virt.libvirt.driver [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.830 187164 DEBUG nova.compute.manager [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpn0vu7xsb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a3590805-f796-4ac1-9051-0976e21b76dd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  5 07:59:12 np0005546954 nova_compute[187160]: 2025-12-05 12:59:12.889 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:59:13 np0005546954 nova_compute[187160]: 2025-12-05 12:59:13.726 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:13 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:13.726 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:59:13 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:13.729 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:59:14 np0005546954 nova_compute[187160]: 2025-12-05 12:59:14.018 187164 DEBUG nova.network.neutron [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Port 80b9305c-89c9-4475-8764-be7039f28636 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  5 07:59:14 np0005546954 nova_compute[187160]: 2025-12-05 12:59:14.019 187164 DEBUG nova.compute.manager [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpn0vu7xsb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a3590805-f796-4ac1-9051-0976e21b76dd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  5 07:59:14 np0005546954 nova_compute[187160]: 2025-12-05 12:59:14.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:59:14 np0005546954 kernel: tap80b9305c-89: entered promiscuous mode
Dec  5 07:59:14 np0005546954 NetworkManager[55665]: <info>  [1764939554.3066] manager: (tap80b9305c-89): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Dec  5 07:59:14 np0005546954 ovn_controller[95566]: 2025-12-05T12:59:14Z|00176|binding|INFO|Claiming lport 80b9305c-89c9-4475-8764-be7039f28636 for this additional chassis.
Dec  5 07:59:14 np0005546954 ovn_controller[95566]: 2025-12-05T12:59:14Z|00177|binding|INFO|80b9305c-89c9-4475-8764-be7039f28636: Claiming fa:16:3e:b9:65:c0 10.100.0.14
Dec  5 07:59:14 np0005546954 nova_compute[187160]: 2025-12-05 12:59:14.308 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:14 np0005546954 ovn_controller[95566]: 2025-12-05T12:59:14Z|00178|binding|INFO|Setting lport 80b9305c-89c9-4475-8764-be7039f28636 ovn-installed in OVS
Dec  5 07:59:14 np0005546954 nova_compute[187160]: 2025-12-05 12:59:14.338 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:14 np0005546954 nova_compute[187160]: 2025-12-05 12:59:14.343 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:14 np0005546954 systemd-udevd[214980]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:59:14 np0005546954 systemd-machined[153497]: New machine qemu-17-instance-00000011.
Dec  5 07:59:14 np0005546954 systemd[1]: Started Virtual Machine qemu-17-instance-00000011.
Dec  5 07:59:14 np0005546954 NetworkManager[55665]: <info>  [1764939554.3839] device (tap80b9305c-89): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:59:14 np0005546954 NetworkManager[55665]: <info>  [1764939554.3865] device (tap80b9305c-89): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:59:15 np0005546954 nova_compute[187160]: 2025-12-05 12:59:15.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:59:15 np0005546954 nova_compute[187160]: 2025-12-05 12:59:15.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:59:15 np0005546954 nova_compute[187160]: 2025-12-05 12:59:15.453 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939555.4525595, a3590805-f796-4ac1-9051-0976e21b76dd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:59:15 np0005546954 nova_compute[187160]: 2025-12-05 12:59:15.453 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] VM Started (Lifecycle Event)#033[00m
Dec  5 07:59:15 np0005546954 nova_compute[187160]: 2025-12-05 12:59:15.634 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:59:15 np0005546954 nova_compute[187160]: 2025-12-05 12:59:15.680 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:16 np0005546954 nova_compute[187160]: 2025-12-05 12:59:16.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:59:16 np0005546954 nova_compute[187160]: 2025-12-05 12:59:16.658 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939556.6576807, a3590805-f796-4ac1-9051-0976e21b76dd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:59:16 np0005546954 nova_compute[187160]: 2025-12-05 12:59:16.658 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:59:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:16.733 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:59:16 np0005546954 nova_compute[187160]: 2025-12-05 12:59:16.819 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:59:16 np0005546954 nova_compute[187160]: 2025-12-05 12:59:16.823 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:59:16 np0005546954 nova_compute[187160]: 2025-12-05 12:59:16.865 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  5 07:59:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:16.959 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:59:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:16.960 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:59:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:16.962 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:59:17 np0005546954 nova_compute[187160]: 2025-12-05 12:59:17.821 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:59:19Z|00179|binding|INFO|Claiming lport 80b9305c-89c9-4475-8764-be7039f28636 for this chassis.
Dec  5 07:59:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:59:19Z|00180|binding|INFO|80b9305c-89c9-4475-8764-be7039f28636: Claiming fa:16:3e:b9:65:c0 10.100.0.14
Dec  5 07:59:19 np0005546954 ovn_controller[95566]: 2025-12-05T12:59:19Z|00181|binding|INFO|Setting lport 80b9305c-89c9-4475-8764-be7039f28636 up in Southbound
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.318 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:65:c0 10.100.0.14'], port_security=['fa:16:3e:b9:65:c0 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'a3590805-f796-4ac1-9051-0976e21b76dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '11', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=80b9305c-89c9-4475-8764-be7039f28636) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.320 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 80b9305c-89c9-4475-8764-be7039f28636 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis#033[00m
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.322 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.341 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ae782de4-6bb8-42c3-b813-4f518d28ed8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.378 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[67d690c5-b07c-4d29-848a-0e9fff131b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.382 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[109a35ba-092f-4f88-a459-a24f80d46a92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:59:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:59:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:59:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:59:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:59:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:59:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:59:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:59:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:59:19 np0005546954 openstack_network_exporter[199661]: ERROR   12:59:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:59:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.427 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[7381127d-08d8-43a8-850b-a51b705a8e2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.451 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4477dd87-8d8d-42ca-96ac-41abd8998cf4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450948, 'reachable_time': 27184, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215009, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.474 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[73e70b15-4791-448c-8d7f-f421ccdc0432]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450959, 'tstamp': 450959}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215010, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450963, 'tstamp': 450963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215010, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.476 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:59:19 np0005546954 nova_compute[187160]: 2025-12-05 12:59:19.478 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:19 np0005546954 nova_compute[187160]: 2025-12-05 12:59:19.480 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.480 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.481 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.481 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:59:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:19.482 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:59:19 np0005546954 nova_compute[187160]: 2025-12-05 12:59:19.541 187164 INFO nova.compute.manager [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Post operation of migration started#033[00m
Dec  5 07:59:19 np0005546954 nova_compute[187160]: 2025-12-05 12:59:19.838 187164 DEBUG oslo_concurrency.lockutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-a3590805-f796-4ac1-9051-0976e21b76dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:59:19 np0005546954 nova_compute[187160]: 2025-12-05 12:59:19.839 187164 DEBUG oslo_concurrency.lockutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-a3590805-f796-4ac1-9051-0976e21b76dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:59:19 np0005546954 nova_compute[187160]: 2025-12-05 12:59:19.839 187164 DEBUG nova.network.neutron [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:59:20 np0005546954 podman[215012]: 2025-12-05 12:59:20.569144441 +0000 UTC m=+0.079861830 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:59:20 np0005546954 podman[215011]: 2025-12-05 12:59:20.569604816 +0000 UTC m=+0.080749559 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec  5 07:59:20 np0005546954 nova_compute[187160]: 2025-12-05 12:59:20.682 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:21 np0005546954 nova_compute[187160]: 2025-12-05 12:59:21.174 187164 DEBUG nova.network.neutron [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Updating instance_info_cache with network_info: [{"id": "80b9305c-89c9-4475-8764-be7039f28636", "address": "fa:16:3e:b9:65:c0", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80b9305c-89", "ovs_interfaceid": "80b9305c-89c9-4475-8764-be7039f28636", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:59:21 np0005546954 nova_compute[187160]: 2025-12-05 12:59:21.211 187164 DEBUG oslo_concurrency.lockutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-a3590805-f796-4ac1-9051-0976e21b76dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:59:21 np0005546954 nova_compute[187160]: 2025-12-05 12:59:21.231 187164 DEBUG oslo_concurrency.lockutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:59:21 np0005546954 nova_compute[187160]: 2025-12-05 12:59:21.231 187164 DEBUG oslo_concurrency.lockutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:59:21 np0005546954 nova_compute[187160]: 2025-12-05 12:59:21.232 187164 DEBUG oslo_concurrency.lockutils [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:59:21 np0005546954 nova_compute[187160]: 2025-12-05 12:59:21.236 187164 INFO nova.virt.libvirt.driver [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  5 07:59:21 np0005546954 virtqemud[186730]: Domain id=17 name='instance-00000011' uuid=a3590805-f796-4ac1-9051-0976e21b76dd is tainted: custom-monitor
Dec  5 07:59:22 np0005546954 nova_compute[187160]: 2025-12-05 12:59:22.244 187164 INFO nova.virt.libvirt.driver [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  5 07:59:22 np0005546954 nova_compute[187160]: 2025-12-05 12:59:22.824 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:23 np0005546954 nova_compute[187160]: 2025-12-05 12:59:23.253 187164 INFO nova.virt.libvirt.driver [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  5 07:59:23 np0005546954 nova_compute[187160]: 2025-12-05 12:59:23.259 187164 DEBUG nova.compute.manager [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:59:23 np0005546954 nova_compute[187160]: 2025-12-05 12:59:23.285 187164 DEBUG nova.objects.instance [None req-b4010468-7413-40de-a602-81cc0829b4eb 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:59:25 np0005546954 nova_compute[187160]: 2025-12-05 12:59:25.686 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.152 187164 DEBUG oslo_concurrency.lockutils [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "4287ea95-6b7c-4583-ba8c-a9fdca606587" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.153 187164 DEBUG oslo_concurrency.lockutils [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.153 187164 DEBUG oslo_concurrency.lockutils [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.154 187164 DEBUG oslo_concurrency.lockutils [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.154 187164 DEBUG oslo_concurrency.lockutils [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.156 187164 INFO nova.compute.manager [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Terminating instance#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.158 187164 DEBUG nova.compute.manager [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:59:27 np0005546954 kernel: tap387def1f-13 (unregistering): left promiscuous mode
Dec  5 07:59:27 np0005546954 NetworkManager[55665]: <info>  [1764939567.2000] device (tap387def1f-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.204 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:27 np0005546954 ovn_controller[95566]: 2025-12-05T12:59:27Z|00182|binding|INFO|Releasing lport 387def1f-1379-4223-bac4-15b131c92566 from this chassis (sb_readonly=0)
Dec  5 07:59:27 np0005546954 ovn_controller[95566]: 2025-12-05T12:59:27Z|00183|binding|INFO|Setting lport 387def1f-1379-4223-bac4-15b131c92566 down in Southbound
Dec  5 07:59:27 np0005546954 ovn_controller[95566]: 2025-12-05T12:59:27Z|00184|binding|INFO|Removing iface tap387def1f-13 ovn-installed in OVS
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.208 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.232 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:77:70 10.100.0.13'], port_security=['fa:16:3e:2f:77:70 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4287ea95-6b7c-4583-ba8c-a9fdca606587', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=387def1f-1379-4223-bac4-15b131c92566) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.234 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 387def1f-1379-4223-bac4-15b131c92566 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.236 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.237 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.255 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[00c8d629-539f-48f0-b94e-31784f50970b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:27 np0005546954 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000012.scope: Deactivated successfully.
Dec  5 07:59:27 np0005546954 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000012.scope: Consumed 15.072s CPU time.
Dec  5 07:59:27 np0005546954 systemd-machined[153497]: Machine qemu-16-instance-00000012 terminated.
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.285 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[3d3d0c74-9e7a-4d0b-bb89-900c0de5afd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.287 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[637ba090-3d68-4428-9891-527461bae4d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.316 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[4a2738a2-5407-4092-8bf6-1a72e8239a98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.334 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[60cb6e1a-e188-4128-97ec-9e15a1412e38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450948, 'reachable_time': 27184, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215064, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.351 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[0e25f2c2-d129-4f7d-be33-a66690f3855e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450959, 'tstamp': 450959}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215065, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450963, 'tstamp': 450963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215065, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.352 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.354 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.358 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.358 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.359 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.360 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:59:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:27.360 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.436 187164 INFO nova.virt.libvirt.driver [-] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Instance destroyed successfully.#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.437 187164 DEBUG nova.objects.instance [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'resources' on Instance uuid 4287ea95-6b7c-4583-ba8c-a9fdca606587 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.460 187164 DEBUG nova.virt.libvirt.vif [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:58:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-252433423',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-252433423',id=18,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:58:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-gihsj0d3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_na
me='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:58:23Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=4287ea95-6b7c-4583-ba8c-a9fdca606587,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "387def1f-1379-4223-bac4-15b131c92566", "address": "fa:16:3e:2f:77:70", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap387def1f-13", "ovs_interfaceid": "387def1f-1379-4223-bac4-15b131c92566", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.460 187164 DEBUG nova.network.os_vif_util [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "387def1f-1379-4223-bac4-15b131c92566", "address": "fa:16:3e:2f:77:70", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap387def1f-13", "ovs_interfaceid": "387def1f-1379-4223-bac4-15b131c92566", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.461 187164 DEBUG nova.network.os_vif_util [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:77:70,bridge_name='br-int',has_traffic_filtering=True,id=387def1f-1379-4223-bac4-15b131c92566,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap387def1f-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.461 187164 DEBUG os_vif [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:77:70,bridge_name='br-int',has_traffic_filtering=True,id=387def1f-1379-4223-bac4-15b131c92566,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap387def1f-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.462 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.463 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap387def1f-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.464 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.465 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.468 187164 INFO os_vif [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:77:70,bridge_name='br-int',has_traffic_filtering=True,id=387def1f-1379-4223-bac4-15b131c92566,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap387def1f-13')#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.468 187164 INFO nova.virt.libvirt.driver [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Deleting instance files /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587_del#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.469 187164 INFO nova.virt.libvirt.driver [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Deletion of /var/lib/nova/instances/4287ea95-6b7c-4583-ba8c-a9fdca606587_del complete#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.513 187164 INFO nova.compute.manager [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.514 187164 DEBUG oslo.service.loopingcall [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.514 187164 DEBUG nova.compute.manager [-] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.514 187164 DEBUG nova.network.neutron [-] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.943 187164 DEBUG nova.compute.manager [req-baba81b7-f2fd-43db-85ed-b5a8530c963b req-7e209262-3eb8-47eb-ab55-c0dc0e943731 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Received event network-vif-unplugged-387def1f-1379-4223-bac4-15b131c92566 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.944 187164 DEBUG oslo_concurrency.lockutils [req-baba81b7-f2fd-43db-85ed-b5a8530c963b req-7e209262-3eb8-47eb-ab55-c0dc0e943731 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.944 187164 DEBUG oslo_concurrency.lockutils [req-baba81b7-f2fd-43db-85ed-b5a8530c963b req-7e209262-3eb8-47eb-ab55-c0dc0e943731 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.945 187164 DEBUG oslo_concurrency.lockutils [req-baba81b7-f2fd-43db-85ed-b5a8530c963b req-7e209262-3eb8-47eb-ab55-c0dc0e943731 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.945 187164 DEBUG nova.compute.manager [req-baba81b7-f2fd-43db-85ed-b5a8530c963b req-7e209262-3eb8-47eb-ab55-c0dc0e943731 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] No waiting events found dispatching network-vif-unplugged-387def1f-1379-4223-bac4-15b131c92566 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:59:27 np0005546954 nova_compute[187160]: 2025-12-05 12:59:27.945 187164 DEBUG nova.compute.manager [req-baba81b7-f2fd-43db-85ed-b5a8530c963b req-7e209262-3eb8-47eb-ab55-c0dc0e943731 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Received event network-vif-unplugged-387def1f-1379-4223-bac4-15b131c92566 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec  5 07:59:28 np0005546954 nova_compute[187160]: 2025-12-05 12:59:28.228 187164 DEBUG nova.network.neutron [-] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:59:28 np0005546954 nova_compute[187160]: 2025-12-05 12:59:28.259 187164 INFO nova.compute.manager [-] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Took 0.74 seconds to deallocate network for instance.
Dec  5 07:59:28 np0005546954 nova_compute[187160]: 2025-12-05 12:59:28.357 187164 DEBUG oslo_concurrency.lockutils [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:59:28 np0005546954 nova_compute[187160]: 2025-12-05 12:59:28.357 187164 DEBUG oslo_concurrency.lockutils [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:59:28 np0005546954 nova_compute[187160]: 2025-12-05 12:59:28.427 187164 DEBUG nova.compute.provider_tree [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:59:28 np0005546954 nova_compute[187160]: 2025-12-05 12:59:28.683 187164 DEBUG nova.scheduler.client.report [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:59:29 np0005546954 nova_compute[187160]: 2025-12-05 12:59:29.009 187164 DEBUG oslo_concurrency.lockutils [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:59:29 np0005546954 nova_compute[187160]: 2025-12-05 12:59:29.232 187164 INFO nova.scheduler.client.report [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Deleted allocations for instance 4287ea95-6b7c-4583-ba8c-a9fdca606587
Dec  5 07:59:29 np0005546954 nova_compute[187160]: 2025-12-05 12:59:29.593 187164 DEBUG oslo_concurrency.lockutils [None req-2d8dd6b6-51b0-4289-bdfe-c396418b9714 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.440s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:59:30 np0005546954 nova_compute[187160]: 2025-12-05 12:59:30.019 187164 DEBUG nova.compute.manager [req-dee2758a-91cd-458e-ac99-e6ffd4a6cc2b req-1e5166fd-03bf-45f0-a130-242f7d03b16d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Received event network-vif-plugged-387def1f-1379-4223-bac4-15b131c92566 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:59:30 np0005546954 nova_compute[187160]: 2025-12-05 12:59:30.020 187164 DEBUG oslo_concurrency.lockutils [req-dee2758a-91cd-458e-ac99-e6ffd4a6cc2b req-1e5166fd-03bf-45f0-a130-242f7d03b16d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:59:30 np0005546954 nova_compute[187160]: 2025-12-05 12:59:30.020 187164 DEBUG oslo_concurrency.lockutils [req-dee2758a-91cd-458e-ac99-e6ffd4a6cc2b req-1e5166fd-03bf-45f0-a130-242f7d03b16d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:59:30 np0005546954 nova_compute[187160]: 2025-12-05 12:59:30.020 187164 DEBUG oslo_concurrency.lockutils [req-dee2758a-91cd-458e-ac99-e6ffd4a6cc2b req-1e5166fd-03bf-45f0-a130-242f7d03b16d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4287ea95-6b7c-4583-ba8c-a9fdca606587-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:59:30 np0005546954 nova_compute[187160]: 2025-12-05 12:59:30.021 187164 DEBUG nova.compute.manager [req-dee2758a-91cd-458e-ac99-e6ffd4a6cc2b req-1e5166fd-03bf-45f0-a130-242f7d03b16d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] No waiting events found dispatching network-vif-plugged-387def1f-1379-4223-bac4-15b131c92566 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:59:30 np0005546954 nova_compute[187160]: 2025-12-05 12:59:30.021 187164 WARNING nova.compute.manager [req-dee2758a-91cd-458e-ac99-e6ffd4a6cc2b req-1e5166fd-03bf-45f0-a130-242f7d03b16d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Received unexpected event network-vif-plugged-387def1f-1379-4223-bac4-15b131c92566 for instance with vm_state deleted and task_state None.
Dec  5 07:59:30 np0005546954 nova_compute[187160]: 2025-12-05 12:59:30.021 187164 DEBUG nova.compute.manager [req-dee2758a-91cd-458e-ac99-e6ffd4a6cc2b req-1e5166fd-03bf-45f0-a130-242f7d03b16d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Received event network-vif-deleted-387def1f-1379-4223-bac4-15b131c92566 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:59:30 np0005546954 nova_compute[187160]: 2025-12-05 12:59:30.689 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.326 187164 DEBUG oslo_concurrency.lockutils [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "a3590805-f796-4ac1-9051-0976e21b76dd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.326 187164 DEBUG oslo_concurrency.lockutils [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "a3590805-f796-4ac1-9051-0976e21b76dd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.327 187164 DEBUG oslo_concurrency.lockutils [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "a3590805-f796-4ac1-9051-0976e21b76dd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.327 187164 DEBUG oslo_concurrency.lockutils [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "a3590805-f796-4ac1-9051-0976e21b76dd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.328 187164 DEBUG oslo_concurrency.lockutils [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "a3590805-f796-4ac1-9051-0976e21b76dd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.330 187164 INFO nova.compute.manager [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Terminating instance
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.331 187164 DEBUG nova.compute.manager [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec  5 07:59:31 np0005546954 kernel: tap80b9305c-89 (unregistering): left promiscuous mode
Dec  5 07:59:31 np0005546954 NetworkManager[55665]: <info>  [1764939571.3696] device (tap80b9305c-89): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:59:31 np0005546954 ovn_controller[95566]: 2025-12-05T12:59:31Z|00185|binding|INFO|Releasing lport 80b9305c-89c9-4475-8764-be7039f28636 from this chassis (sb_readonly=0)
Dec  5 07:59:31 np0005546954 ovn_controller[95566]: 2025-12-05T12:59:31Z|00186|binding|INFO|Setting lport 80b9305c-89c9-4475-8764-be7039f28636 down in Southbound
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.423 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:59:31 np0005546954 ovn_controller[95566]: 2025-12-05T12:59:31Z|00187|binding|INFO|Removing iface tap80b9305c-89 ovn-installed in OVS
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.426 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.434 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:65:c0 10.100.0.14'], port_security=['fa:16:3e:b9:65:c0 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'a3590805-f796-4ac1-9051-0976e21b76dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '13', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=80b9305c-89c9-4475-8764-be7039f28636) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.436 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 80b9305c-89c9-4475-8764-be7039f28636 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.439 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d4389bc8-2898-48b0-9741-5183b54fe83c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.440 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[0dab4348-d5da-4c87-9cec-85d8a1b36a6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.441 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace which is not needed anymore
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.447 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:59:31 np0005546954 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000011.scope: Deactivated successfully.
Dec  5 07:59:31 np0005546954 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000011.scope: Consumed 2.347s CPU time.
Dec  5 07:59:31 np0005546954 systemd-machined[153497]: Machine qemu-17-instance-00000011 terminated.
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.610 187164 INFO nova.virt.libvirt.driver [-] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Instance destroyed successfully.
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.611 187164 DEBUG nova.objects.instance [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'resources' on Instance uuid a3590805-f796-4ac1-9051-0976e21b76dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.628 187164 DEBUG nova.virt.libvirt.vif [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1601943028',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1601943028',id=17,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:58:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-8ld7dxoq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:59:23Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=a3590805-f796-4ac1-9051-0976e21b76dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "80b9305c-89c9-4475-8764-be7039f28636", "address": "fa:16:3e:b9:65:c0", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80b9305c-89", "ovs_interfaceid": "80b9305c-89c9-4475-8764-be7039f28636", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.630 187164 DEBUG nova.network.os_vif_util [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "80b9305c-89c9-4475-8764-be7039f28636", "address": "fa:16:3e:b9:65:c0", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80b9305c-89", "ovs_interfaceid": "80b9305c-89c9-4475-8764-be7039f28636", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.631 187164 DEBUG nova.network.os_vif_util [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:65:c0,bridge_name='br-int',has_traffic_filtering=True,id=80b9305c-89c9-4475-8764-be7039f28636,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80b9305c-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.631 187164 DEBUG os_vif [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:65:c0,bridge_name='br-int',has_traffic_filtering=True,id=80b9305c-89c9-4475-8764-be7039f28636,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80b9305c-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.633 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.633 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80b9305c-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.635 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:59:31 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[214725]: [NOTICE]   (214729) : haproxy version is 2.8.14-c23fe91
Dec  5 07:59:31 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[214725]: [NOTICE]   (214729) : path to executable is /usr/sbin/haproxy
Dec  5 07:59:31 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[214725]: [WARNING]  (214729) : Exiting Master process...
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.636 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:59:31 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[214725]: [ALERT]    (214729) : Current worker (214731) exited with code 143 (Terminated)
Dec  5 07:59:31 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[214725]: [WARNING]  (214729) : All workers exited. Exiting... (0)
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.639 187164 INFO os_vif [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:65:c0,bridge_name='br-int',has_traffic_filtering=True,id=80b9305c-89c9-4475-8764-be7039f28636,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80b9305c-89')
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.639 187164 INFO nova.virt.libvirt.driver [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Deleting instance files /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd_del
Dec  5 07:59:31 np0005546954 systemd[1]: libpod-232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550.scope: Deactivated successfully.
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.640 187164 INFO nova.virt.libvirt.driver [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Deletion of /var/lib/nova/instances/a3590805-f796-4ac1-9051-0976e21b76dd_del complete
Dec  5 07:59:31 np0005546954 podman[215116]: 2025-12-05 12:59:31.647816963 +0000 UTC m=+0.058525195 container died 232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:59:31 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550-userdata-shm.mount: Deactivated successfully.
Dec  5 07:59:31 np0005546954 systemd[1]: var-lib-containers-storage-overlay-40d28398048d3be33914f9ee9d5fcbf16984d882fe15f83a1d3b5c090aa553cf-merged.mount: Deactivated successfully.
Dec  5 07:59:31 np0005546954 podman[215116]: 2025-12-05 12:59:31.681867895 +0000 UTC m=+0.092576127 container cleanup 232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.698 187164 INFO nova.compute.manager [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Took 0.37 seconds to destroy the instance on the hypervisor.
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.699 187164 DEBUG oslo.service.loopingcall [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.700 187164 DEBUG nova.compute.manager [-] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.700 187164 DEBUG nova.network.neutron [-] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec  5 07:59:31 np0005546954 systemd[1]: libpod-conmon-232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550.scope: Deactivated successfully.
Dec  5 07:59:31 np0005546954 podman[215156]: 2025-12-05 12:59:31.743741874 +0000 UTC m=+0.042249508 container remove 232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.748 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[27414dd4-435e-4701-bcae-e5cba963619f]: (4, ('Fri Dec  5 12:59:31 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550)\n232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550\nFri Dec  5 12:59:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550)\n232edc7b453d02761f246ea18c636f6b450d6769bdf0a4fbc3d7012f74fe7550\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.750 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c2f90b98-7aeb-4a25-94cd-fb0cecc15778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.751 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.752 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:59:31 np0005546954 kernel: tapd4389bc8-20: left promiscuous mode
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.757 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[de634f07-a386-4547-996c-f7f1404b4f2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:59:31 np0005546954 nova_compute[187160]: 2025-12-05 12:59:31.767 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.794 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[8dda9454-523f-4800-baf4-ef076daeb808]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.796 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f5587c6d-0324-4a93-97ee-45dd89434c5c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.812 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[00e7f9a1-4432-4b00-9fde-a2b8f00d6e2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450942, 'reachable_time': 15718, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215173, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.814 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:59:31 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 12:59:31.815 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[10b0f770-bed9-4705-9ed6-0768f7568b54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:59:31 np0005546954 systemd[1]: run-netns-ovnmeta\x2dd4389bc8\x2d2898\x2d48b0\x2d9741\x2d5183b54fe83c.mount: Deactivated successfully.
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.097 187164 DEBUG nova.compute.manager [req-ceb46884-6f91-428d-9919-1f31156ce315 req-868a240d-ac18-425a-b82d-5ed1cf3e6eba 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Received event network-vif-unplugged-80b9305c-89c9-4475-8764-be7039f28636 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.098 187164 DEBUG oslo_concurrency.lockutils [req-ceb46884-6f91-428d-9919-1f31156ce315 req-868a240d-ac18-425a-b82d-5ed1cf3e6eba 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "a3590805-f796-4ac1-9051-0976e21b76dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.098 187164 DEBUG oslo_concurrency.lockutils [req-ceb46884-6f91-428d-9919-1f31156ce315 req-868a240d-ac18-425a-b82d-5ed1cf3e6eba 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a3590805-f796-4ac1-9051-0976e21b76dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.098 187164 DEBUG oslo_concurrency.lockutils [req-ceb46884-6f91-428d-9919-1f31156ce315 req-868a240d-ac18-425a-b82d-5ed1cf3e6eba 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a3590805-f796-4ac1-9051-0976e21b76dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.098 187164 DEBUG nova.compute.manager [req-ceb46884-6f91-428d-9919-1f31156ce315 req-868a240d-ac18-425a-b82d-5ed1cf3e6eba 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] No waiting events found dispatching network-vif-unplugged-80b9305c-89c9-4475-8764-be7039f28636 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.099 187164 DEBUG nova.compute.manager [req-ceb46884-6f91-428d-9919-1f31156ce315 req-868a240d-ac18-425a-b82d-5ed1cf3e6eba 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Received event network-vif-unplugged-80b9305c-89c9-4475-8764-be7039f28636 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.328 187164 DEBUG nova.network.neutron [-] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.342 187164 INFO nova.compute.manager [-] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Took 0.64 seconds to deallocate network for instance.#033[00m
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.390 187164 DEBUG oslo_concurrency.lockutils [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.391 187164 DEBUG oslo_concurrency.lockutils [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.396 187164 DEBUG oslo_concurrency.lockutils [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.425 187164 INFO nova.scheduler.client.report [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Deleted allocations for instance a3590805-f796-4ac1-9051-0976e21b76dd#033[00m
Dec  5 07:59:32 np0005546954 nova_compute[187160]: 2025-12-05 12:59:32.496 187164 DEBUG oslo_concurrency.lockutils [None req-16e24e17-7679-4476-a6ef-15801f4da893 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "a3590805-f796-4ac1-9051-0976e21b76dd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:59:33 np0005546954 podman[215174]: 2025-12-05 12:59:33.578940423 +0000 UTC m=+0.080669185 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 07:59:34 np0005546954 nova_compute[187160]: 2025-12-05 12:59:34.169 187164 DEBUG nova.compute.manager [req-5ef9bd7a-ce9c-4cd0-9fcd-16b3b18bcb6e req-2a6fb88e-1338-4ddf-8060-e6201df305fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Received event network-vif-plugged-80b9305c-89c9-4475-8764-be7039f28636 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:59:34 np0005546954 nova_compute[187160]: 2025-12-05 12:59:34.170 187164 DEBUG oslo_concurrency.lockutils [req-5ef9bd7a-ce9c-4cd0-9fcd-16b3b18bcb6e req-2a6fb88e-1338-4ddf-8060-e6201df305fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "a3590805-f796-4ac1-9051-0976e21b76dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:59:34 np0005546954 nova_compute[187160]: 2025-12-05 12:59:34.171 187164 DEBUG oslo_concurrency.lockutils [req-5ef9bd7a-ce9c-4cd0-9fcd-16b3b18bcb6e req-2a6fb88e-1338-4ddf-8060-e6201df305fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a3590805-f796-4ac1-9051-0976e21b76dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:59:34 np0005546954 nova_compute[187160]: 2025-12-05 12:59:34.171 187164 DEBUG oslo_concurrency.lockutils [req-5ef9bd7a-ce9c-4cd0-9fcd-16b3b18bcb6e req-2a6fb88e-1338-4ddf-8060-e6201df305fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a3590805-f796-4ac1-9051-0976e21b76dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:59:34 np0005546954 nova_compute[187160]: 2025-12-05 12:59:34.172 187164 DEBUG nova.compute.manager [req-5ef9bd7a-ce9c-4cd0-9fcd-16b3b18bcb6e req-2a6fb88e-1338-4ddf-8060-e6201df305fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] No waiting events found dispatching network-vif-plugged-80b9305c-89c9-4475-8764-be7039f28636 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:59:34 np0005546954 nova_compute[187160]: 2025-12-05 12:59:34.172 187164 WARNING nova.compute.manager [req-5ef9bd7a-ce9c-4cd0-9fcd-16b3b18bcb6e req-2a6fb88e-1338-4ddf-8060-e6201df305fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Received unexpected event network-vif-plugged-80b9305c-89c9-4475-8764-be7039f28636 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:59:34 np0005546954 nova_compute[187160]: 2025-12-05 12:59:34.172 187164 DEBUG nova.compute.manager [req-5ef9bd7a-ce9c-4cd0-9fcd-16b3b18bcb6e req-2a6fb88e-1338-4ddf-8060-e6201df305fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Received event network-vif-deleted-80b9305c-89c9-4475-8764-be7039f28636 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:59:35 np0005546954 podman[197513]: time="2025-12-05T12:59:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 07:59:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:59:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 07:59:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:12:59:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Dec  5 07:59:35 np0005546954 nova_compute[187160]: 2025-12-05 12:59:35.689 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:36 np0005546954 nova_compute[187160]: 2025-12-05 12:59:36.635 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:38 np0005546954 podman[215194]: 2025-12-05 12:59:38.57500214 +0000 UTC m=+0.081371638 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:59:38 np0005546954 podman[215193]: 2025-12-05 12:59:38.606291314 +0000 UTC m=+0.118231016 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  5 07:59:40 np0005546954 nova_compute[187160]: 2025-12-05 12:59:40.691 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:41 np0005546954 nova_compute[187160]: 2025-12-05 12:59:41.638 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:42 np0005546954 nova_compute[187160]: 2025-12-05 12:59:42.434 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939567.434076, 4287ea95-6b7c-4583-ba8c-a9fdca606587 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:59:42 np0005546954 nova_compute[187160]: 2025-12-05 12:59:42.435 187164 INFO nova.compute.manager [-] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:59:42 np0005546954 nova_compute[187160]: 2025-12-05 12:59:42.458 187164 DEBUG nova.compute.manager [None req-25185eac-f55e-4d2c-b60d-95a0d0161873 - - - - - -] [instance: 4287ea95-6b7c-4583-ba8c-a9fdca606587] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:59:45 np0005546954 nova_compute[187160]: 2025-12-05 12:59:45.693 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:46 np0005546954 nova_compute[187160]: 2025-12-05 12:59:46.609 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939571.6079595, a3590805-f796-4ac1-9051-0976e21b76dd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:59:46 np0005546954 nova_compute[187160]: 2025-12-05 12:59:46.610 187164 INFO nova.compute.manager [-] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:59:46 np0005546954 nova_compute[187160]: 2025-12-05 12:59:46.640 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:46 np0005546954 nova_compute[187160]: 2025-12-05 12:59:46.671 187164 DEBUG nova.compute.manager [None req-abeb947d-017d-4311-b494-c8c1c84d8c86 - - - - - -] [instance: a3590805-f796-4ac1-9051-0976e21b76dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:59:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:59:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 07:59:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:59:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:59:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:59:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 07:59:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:59:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 07:59:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:59:49 np0005546954 openstack_network_exporter[199661]: ERROR   12:59:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 07:59:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 07:59:50 np0005546954 nova_compute[187160]: 2025-12-05 12:59:50.696 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:51 np0005546954 podman[215243]: 2025-12-05 12:59:51.554868741 +0000 UTC m=+0.062537610 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 07:59:51 np0005546954 podman[215242]: 2025-12-05 12:59:51.561423566 +0000 UTC m=+0.073119301 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=)
Dec  5 07:59:51 np0005546954 nova_compute[187160]: 2025-12-05 12:59:51.642 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:55 np0005546954 nova_compute[187160]: 2025-12-05 12:59:55.699 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:56 np0005546954 nova_compute[187160]: 2025-12-05 12:59:56.645 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:59:59 np0005546954 nova_compute[187160]: 2025-12-05 12:59:59.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:00:00 np0005546954 nova_compute[187160]: 2025-12-05 13:00:00.701 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:01 np0005546954 nova_compute[187160]: 2025-12-05 13:00:01.647 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:04 np0005546954 podman[215282]: 2025-12-05 13:00:04.597321995 +0000 UTC m=+0.089973556 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 08:00:05 np0005546954 nova_compute[187160]: 2025-12-05 13:00:05.050 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:00:05 np0005546954 nova_compute[187160]: 2025-12-05 13:00:05.050 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:00:05 np0005546954 nova_compute[187160]: 2025-12-05 13:00:05.050 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:00:05 np0005546954 nova_compute[187160]: 2025-12-05 13:00:05.064 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:00:05 np0005546954 nova_compute[187160]: 2025-12-05 13:00:05.065 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:00:05 np0005546954 podman[197513]: time="2025-12-05T13:00:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:00:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:00:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:00:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:00:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Dec  5 08:00:05 np0005546954 nova_compute[187160]: 2025-12-05 13:00:05.703 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:06 np0005546954 nova_compute[187160]: 2025-12-05 13:00:06.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:00:06 np0005546954 nova_compute[187160]: 2025-12-05 13:00:06.649 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:07 np0005546954 nova_compute[187160]: 2025-12-05 13:00:07.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:00:07 np0005546954 nova_compute[187160]: 2025-12-05 13:00:07.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 08:00:07 np0005546954 nova_compute[187160]: 2025-12-05 13:00:07.072 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 08:00:09 np0005546954 nova_compute[187160]: 2025-12-05 13:00:09.073 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:00:09 np0005546954 podman[215303]: 2025-12-05 13:00:09.621642552 +0000 UTC m=+0.118960210 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:00:09 np0005546954 podman[215302]: 2025-12-05 13:00:09.637776085 +0000 UTC m=+0.140182901 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.203 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.203 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.203 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.204 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.401 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.402 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5870MB free_disk=73.33578872680664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.403 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.403 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.561 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.561 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.628 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.705 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:10 np0005546954 nova_compute[187160]: 2025-12-05 13:00:10.911 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:00:11 np0005546954 nova_compute[187160]: 2025-12-05 13:00:11.360 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:00:11 np0005546954 nova_compute[187160]: 2025-12-05 13:00:11.361 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.958s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:00:11 np0005546954 nova_compute[187160]: 2025-12-05 13:00:11.362 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:00:11 np0005546954 nova_compute[187160]: 2025-12-05 13:00:11.362 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 08:00:11 np0005546954 nova_compute[187160]: 2025-12-05 13:00:11.651 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:14 np0005546954 nova_compute[187160]: 2025-12-05 13:00:14.537 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:00:14 np0005546954 nova_compute[187160]: 2025-12-05 13:00:14.538 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:00:14 np0005546954 nova_compute[187160]: 2025-12-05 13:00:14.554 187164 DEBUG nova.compute.manager [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 08:00:14 np0005546954 nova_compute[187160]: 2025-12-05 13:00:14.557 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:00:14 np0005546954 nova_compute[187160]: 2025-12-05 13:00:14.613 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:00:14 np0005546954 nova_compute[187160]: 2025-12-05 13:00:14.614 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:00:14 np0005546954 nova_compute[187160]: 2025-12-05 13:00:14.623 187164 DEBUG nova.virt.hardware [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 08:00:14 np0005546954 nova_compute[187160]: 2025-12-05 13:00:14.624 187164 INFO nova.compute.claims [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  5 08:00:15 np0005546954 nova_compute[187160]: 2025-12-05 13:00:15.584 187164 DEBUG nova.compute.provider_tree [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:00:15 np0005546954 nova_compute[187160]: 2025-12-05 13:00:15.600 187164 DEBUG nova.scheduler.client.report [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:00:15 np0005546954 nova_compute[187160]: 2025-12-05 13:00:15.707 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:15 np0005546954 nova_compute[187160]: 2025-12-05 13:00:15.813 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:00:15 np0005546954 nova_compute[187160]: 2025-12-05 13:00:15.814 187164 DEBUG nova.compute.manager [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 08:00:15 np0005546954 nova_compute[187160]: 2025-12-05 13:00:15.865 187164 DEBUG nova.compute.manager [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 08:00:15 np0005546954 nova_compute[187160]: 2025-12-05 13:00:15.866 187164 DEBUG nova.network.neutron [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 08:00:15 np0005546954 nova_compute[187160]: 2025-12-05 13:00:15.956 187164 INFO nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.019 187164 DEBUG nova.compute.manager [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.025 187164 DEBUG nova.policy [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ae0bb20ac8b4be99eb1abddc7310436', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.524 187164 DEBUG nova.compute.manager [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.525 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.526 187164 INFO nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Creating image(s)#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.527 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "/var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.527 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.528 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.543 187164 DEBUG oslo_concurrency.processutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.615 187164 DEBUG oslo_concurrency.processutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.616 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.617 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.631 187164 DEBUG oslo_concurrency.processutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.654 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.686 187164 DEBUG oslo_concurrency.processutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.688 187164 DEBUG oslo_concurrency.processutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.728 187164 DEBUG oslo_concurrency.processutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.730 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.730 187164 DEBUG oslo_concurrency.processutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.786 187164 DEBUG oslo_concurrency.processutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.788 187164 DEBUG nova.virt.disk.api [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Checking if we can resize image /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.789 187164 DEBUG oslo_concurrency.processutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.846 187164 DEBUG oslo_concurrency.processutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.847 187164 DEBUG nova.virt.disk.api [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Cannot resize image /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.848 187164 DEBUG nova.objects.instance [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'migration_context' on Instance uuid a9e7b53a-45dd-415c-9977-b5df73cde5f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.861 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.862 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Ensure instance console log exists: /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.862 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.863 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:00:16 np0005546954 nova_compute[187160]: 2025-12-05 13:00:16.864 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:00:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:16.960 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:00:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:16.960 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:00:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:16.960 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:00:17 np0005546954 nova_compute[187160]: 2025-12-05 13:00:17.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:00:17 np0005546954 nova_compute[187160]: 2025-12-05 13:00:17.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:00:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:00:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:00:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:00:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:00:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:00:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:00:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:00:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:00:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:00:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:00:19 np0005546954 nova_compute[187160]: 2025-12-05 13:00:19.443 187164 DEBUG nova.network.neutron [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Successfully created port: 254255b0-dc98-4210-b14e-8017dc43bccb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 08:00:20 np0005546954 nova_compute[187160]: 2025-12-05 13:00:20.710 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:20.899 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:00:20 np0005546954 nova_compute[187160]: 2025-12-05 13:00:20.900 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:20 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:20.900 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 08:00:21 np0005546954 nova_compute[187160]: 2025-12-05 13:00:21.656 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:21 np0005546954 nova_compute[187160]: 2025-12-05 13:00:21.914 187164 DEBUG nova.network.neutron [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Successfully updated port: 254255b0-dc98-4210-b14e-8017dc43bccb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 08:00:22 np0005546954 nova_compute[187160]: 2025-12-05 13:00:22.133 187164 DEBUG nova.compute.manager [req-6b163ef5-bfb7-4972-b19d-38fd71049513 req-446c3291-8bbd-4fa9-ae4e-f930b1cf8389 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Received event network-changed-254255b0-dc98-4210-b14e-8017dc43bccb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:00:22 np0005546954 nova_compute[187160]: 2025-12-05 13:00:22.133 187164 DEBUG nova.compute.manager [req-6b163ef5-bfb7-4972-b19d-38fd71049513 req-446c3291-8bbd-4fa9-ae4e-f930b1cf8389 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Refreshing instance network info cache due to event network-changed-254255b0-dc98-4210-b14e-8017dc43bccb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 08:00:22 np0005546954 nova_compute[187160]: 2025-12-05 13:00:22.134 187164 DEBUG oslo_concurrency.lockutils [req-6b163ef5-bfb7-4972-b19d-38fd71049513 req-446c3291-8bbd-4fa9-ae4e-f930b1cf8389 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-a9e7b53a-45dd-415c-9977-b5df73cde5f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:00:22 np0005546954 nova_compute[187160]: 2025-12-05 13:00:22.134 187164 DEBUG oslo_concurrency.lockutils [req-6b163ef5-bfb7-4972-b19d-38fd71049513 req-446c3291-8bbd-4fa9-ae4e-f930b1cf8389 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-a9e7b53a-45dd-415c-9977-b5df73cde5f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:00:22 np0005546954 nova_compute[187160]: 2025-12-05 13:00:22.135 187164 DEBUG nova.network.neutron [req-6b163ef5-bfb7-4972-b19d-38fd71049513 req-446c3291-8bbd-4fa9-ae4e-f930b1cf8389 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Refreshing network info cache for port 254255b0-dc98-4210-b14e-8017dc43bccb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 08:00:22 np0005546954 nova_compute[187160]: 2025-12-05 13:00:22.140 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "refresh_cache-a9e7b53a-45dd-415c-9977-b5df73cde5f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:00:22 np0005546954 podman[215371]: 2025-12-05 13:00:22.630031813 +0000 UTC m=+0.127427583 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 08:00:22 np0005546954 podman[215370]: 2025-12-05 13:00:22.632309783 +0000 UTC m=+0.132832021 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64)
Dec  5 08:00:22 np0005546954 nova_compute[187160]: 2025-12-05 13:00:22.774 187164 DEBUG nova.network.neutron [req-6b163ef5-bfb7-4972-b19d-38fd71049513 req-446c3291-8bbd-4fa9-ae4e-f930b1cf8389 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 08:00:23 np0005546954 nova_compute[187160]: 2025-12-05 13:00:23.204 187164 DEBUG nova.network.neutron [req-6b163ef5-bfb7-4972-b19d-38fd71049513 req-446c3291-8bbd-4fa9-ae4e-f930b1cf8389 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:00:24 np0005546954 nova_compute[187160]: 2025-12-05 13:00:24.044 187164 DEBUG oslo_concurrency.lockutils [req-6b163ef5-bfb7-4972-b19d-38fd71049513 req-446c3291-8bbd-4fa9-ae4e-f930b1cf8389 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-a9e7b53a-45dd-415c-9977-b5df73cde5f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:00:24 np0005546954 nova_compute[187160]: 2025-12-05 13:00:24.044 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquired lock "refresh_cache-a9e7b53a-45dd-415c-9977-b5df73cde5f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:00:24 np0005546954 nova_compute[187160]: 2025-12-05 13:00:24.045 187164 DEBUG nova.network.neutron [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 08:00:24 np0005546954 nova_compute[187160]: 2025-12-05 13:00:24.765 187164 DEBUG nova.network.neutron [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 08:00:25 np0005546954 nova_compute[187160]: 2025-12-05 13:00:25.563 187164 DEBUG nova.network.neutron [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Updating instance_info_cache with network_info: [{"id": "254255b0-dc98-4210-b14e-8017dc43bccb", "address": "fa:16:3e:54:54:2c", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254255b0-dc", "ovs_interfaceid": "254255b0-dc98-4210-b14e-8017dc43bccb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:00:25 np0005546954 nova_compute[187160]: 2025-12-05 13:00:25.713 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.659 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.821 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Releasing lock "refresh_cache-a9e7b53a-45dd-415c-9977-b5df73cde5f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.822 187164 DEBUG nova.compute.manager [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Instance network_info: |[{"id": "254255b0-dc98-4210-b14e-8017dc43bccb", "address": "fa:16:3e:54:54:2c", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254255b0-dc", "ovs_interfaceid": "254255b0-dc98-4210-b14e-8017dc43bccb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.825 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Start _get_guest_xml network_info=[{"id": "254255b0-dc98-4210-b14e-8017dc43bccb", "address": "fa:16:3e:54:54:2c", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254255b0-dc", "ovs_interfaceid": "254255b0-dc98-4210-b14e-8017dc43bccb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.831 187164 WARNING nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.844 187164 DEBUG nova.virt.libvirt.host [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.845 187164 DEBUG nova.virt.libvirt.host [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.849 187164 DEBUG nova.virt.libvirt.host [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.850 187164 DEBUG nova.virt.libvirt.host [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.851 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.852 187164 DEBUG nova.virt.hardware [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.852 187164 DEBUG nova.virt.hardware [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.853 187164 DEBUG nova.virt.hardware [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.853 187164 DEBUG nova.virt.hardware [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.853 187164 DEBUG nova.virt.hardware [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.854 187164 DEBUG nova.virt.hardware [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.854 187164 DEBUG nova.virt.hardware [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.854 187164 DEBUG nova.virt.hardware [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.854 187164 DEBUG nova.virt.hardware [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.855 187164 DEBUG nova.virt.hardware [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.855 187164 DEBUG nova.virt.hardware [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.859 187164 DEBUG nova.virt.libvirt.vif [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T13:00:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1525595846',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1525595846',id=20,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-4143b8k5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:00:16Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=a9e7b53a-45dd-415c-9977-b5df73cde5f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "254255b0-dc98-4210-b14e-8017dc43bccb", "address": "fa:16:3e:54:54:2c", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254255b0-dc", "ovs_interfaceid": "254255b0-dc98-4210-b14e-8017dc43bccb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.860 187164 DEBUG nova.network.os_vif_util [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "254255b0-dc98-4210-b14e-8017dc43bccb", "address": "fa:16:3e:54:54:2c", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254255b0-dc", "ovs_interfaceid": "254255b0-dc98-4210-b14e-8017dc43bccb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.861 187164 DEBUG nova.network.os_vif_util [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:54:2c,bridge_name='br-int',has_traffic_filtering=True,id=254255b0-dc98-4210-b14e-8017dc43bccb,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254255b0-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.862 187164 DEBUG nova.objects.instance [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'pci_devices' on Instance uuid a9e7b53a-45dd-415c-9977-b5df73cde5f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.881 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] End _get_guest_xml xml=<domain type="kvm">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  <uuid>a9e7b53a-45dd-415c-9977-b5df73cde5f2</uuid>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  <name>instance-00000014</name>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteStrategies-server-1525595846</nova:name>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 13:00:26</nova:creationTime>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:        <nova:user uuid="0ae0bb20ac8b4be99eb1abddc7310436">tempest-TestExecuteStrategies-192029678-project-member</nova:user>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:        <nova:project uuid="e6ae0d0dcde04b85b6dae45560cca988">tempest-TestExecuteStrategies-192029678</nova:project>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:        <nova:port uuid="254255b0-dc98-4210-b14e-8017dc43bccb">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <system>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <entry name="serial">a9e7b53a-45dd-415c-9977-b5df73cde5f2</entry>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <entry name="uuid">a9e7b53a-45dd-415c-9977-b5df73cde5f2</entry>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    </system>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  <os>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  </os>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  <features>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  </features>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  </clock>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  <devices>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    </disk>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk.config"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    </disk>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:54:54:2c"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <target dev="tap254255b0-dc"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    </interface>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/console.log" append="off"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    </serial>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <video>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    </video>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    </rng>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 08:00:26 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 08:00:26 np0005546954 nova_compute[187160]:  </devices>
Dec  5 08:00:26 np0005546954 nova_compute[187160]: </domain>
Dec  5 08:00:26 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.883 187164 DEBUG nova.compute.manager [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Preparing to wait for external event network-vif-plugged-254255b0-dc98-4210-b14e-8017dc43bccb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.884 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.885 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.886 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.887 187164 DEBUG nova.virt.libvirt.vif [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T13:00:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1525595846',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1525595846',id=20,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-4143b8k5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:00:16Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=a9e7b53a-45dd-415c-9977-b5df73cde5f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "254255b0-dc98-4210-b14e-8017dc43bccb", "address": "fa:16:3e:54:54:2c", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254255b0-dc", "ovs_interfaceid": "254255b0-dc98-4210-b14e-8017dc43bccb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.888 187164 DEBUG nova.network.os_vif_util [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "254255b0-dc98-4210-b14e-8017dc43bccb", "address": "fa:16:3e:54:54:2c", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254255b0-dc", "ovs_interfaceid": "254255b0-dc98-4210-b14e-8017dc43bccb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.889 187164 DEBUG nova.network.os_vif_util [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:54:2c,bridge_name='br-int',has_traffic_filtering=True,id=254255b0-dc98-4210-b14e-8017dc43bccb,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254255b0-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.890 187164 DEBUG os_vif [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:54:2c,bridge_name='br-int',has_traffic_filtering=True,id=254255b0-dc98-4210-b14e-8017dc43bccb,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254255b0-dc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.892 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.892 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.893 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.898 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.899 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap254255b0-dc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.900 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap254255b0-dc, col_values=(('external_ids', {'iface-id': '254255b0-dc98-4210-b14e-8017dc43bccb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:54:2c', 'vm-uuid': 'a9e7b53a-45dd-415c-9977-b5df73cde5f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:00:26 np0005546954 NetworkManager[55665]: <info>  [1764939626.9035] manager: (tap254255b0-dc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.903 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:26 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:26.905 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.907 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.909 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.910 187164 INFO os_vif [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:54:2c,bridge_name='br-int',has_traffic_filtering=True,id=254255b0-dc98-4210-b14e-8017dc43bccb,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254255b0-dc')#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.984 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.985 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.985 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No VIF found with MAC fa:16:3e:54:54:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 08:00:26 np0005546954 nova_compute[187160]: 2025-12-05 13:00:26.985 187164 INFO nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Using config drive#033[00m
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.273 187164 INFO nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Creating config drive at /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk.config#033[00m
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.278 187164 DEBUG oslo_concurrency.processutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5pkpfny8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.417 187164 DEBUG oslo_concurrency.processutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5pkpfny8" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:00:27 np0005546954 kernel: tap254255b0-dc: entered promiscuous mode
Dec  5 08:00:27 np0005546954 ovn_controller[95566]: 2025-12-05T13:00:27Z|00188|binding|INFO|Claiming lport 254255b0-dc98-4210-b14e-8017dc43bccb for this chassis.
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.501 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:27 np0005546954 NetworkManager[55665]: <info>  [1764939627.5020] manager: (tap254255b0-dc): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Dec  5 08:00:27 np0005546954 ovn_controller[95566]: 2025-12-05T13:00:27Z|00189|binding|INFO|254255b0-dc98-4210-b14e-8017dc43bccb: Claiming fa:16:3e:54:54:2c 10.100.0.14
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.510 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:54:2c 10.100.0.14'], port_security=['fa:16:3e:54:54:2c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'a9e7b53a-45dd-415c-9977-b5df73cde5f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=254255b0-dc98-4210-b14e-8017dc43bccb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.513 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 254255b0-dc98-4210-b14e-8017dc43bccb in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis#033[00m
Dec  5 08:00:27 np0005546954 ovn_controller[95566]: 2025-12-05T13:00:27Z|00190|binding|INFO|Setting lport 254255b0-dc98-4210-b14e-8017dc43bccb ovn-installed in OVS
Dec  5 08:00:27 np0005546954 ovn_controller[95566]: 2025-12-05T13:00:27Z|00191|binding|INFO|Setting lport 254255b0-dc98-4210-b14e-8017dc43bccb up in Southbound
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.515 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.517 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.518 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.527 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5709cdf7-d0c9-4741-bdb2-26a09ed4c10a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.529 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd4389bc8-21 in ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 08:00:27 np0005546954 systemd-udevd[215425]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.533 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd4389bc8-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.533 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b25200bc-fc21-4a9c-84cf-7ffcdb0a6588]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.535 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5d5f45cb-ffd9-49cc-9f52-0555ac64dca5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.546 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[99bd47b1-7685-479f-9c48-db7ce5d0b4bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 NetworkManager[55665]: <info>  [1764939627.5493] device (tap254255b0-dc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 08:00:27 np0005546954 NetworkManager[55665]: <info>  [1764939627.5510] device (tap254255b0-dc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 08:00:27 np0005546954 systemd-machined[153497]: New machine qemu-18-instance-00000014.
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.562 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[826bb5c3-7e67-464e-aa07-222e50a42252]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 systemd[1]: Started Virtual Machine qemu-18-instance-00000014.
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.599 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[bb67f030-88b3-451c-a4eb-77a445fa3aa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.607 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[984dbd49-6712-4994-8d5d-8624eead0e45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 NetworkManager[55665]: <info>  [1764939627.6096] manager: (tapd4389bc8-20): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.647 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[e32053a1-3deb-4cb9-91b5-80d9e8f2b183]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.650 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[0784a52c-18c0-4e92-9903-aee0a5e34ab2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 NetworkManager[55665]: <info>  [1764939627.6797] device (tapd4389bc8-20): carrier: link connected
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.685 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb4886d-24e8-4975-8a4e-16bc78ee1aa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.710 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f48f0934-d97e-49c0-8d04-5abdbbe09d3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463430, 'reachable_time': 16106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215459, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.732 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[22d59f00-bdb5-4f46-ba40-52c1db0e17f1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:43f7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463430, 'tstamp': 463430}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215460, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.750 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d2638b0e-3062-4292-8614-2fe1fbe02b8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463430, 'reachable_time': 16106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215461, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.791 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7d9173e3-2cf6-4b6f-8aa2-6d4c1dd21b27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.866 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ce5ef5d0-ff3d-43b6-9043-4724a8d619ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.868 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.868 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.869 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.871 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:27 np0005546954 kernel: tapd4389bc8-20: entered promiscuous mode
Dec  5 08:00:27 np0005546954 NetworkManager[55665]: <info>  [1764939627.8728] manager: (tapd4389bc8-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.875 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.876 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:27 np0005546954 ovn_controller[95566]: 2025-12-05T13:00:27Z|00192|binding|INFO|Releasing lport 8dbe2af5-9f18-44ca-8f22-66854bcdd596 from this chassis (sb_readonly=0)
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.891 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.892 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.893 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f25ee197-12ee-40ea-a909-fce3dc1af51e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.894 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 08:00:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:00:27.895 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'env', 'PROCESS_TAG=haproxy-d4389bc8-2898-48b0-9741-5183b54fe83c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d4389bc8-2898-48b0-9741-5183b54fe83c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.926 187164 DEBUG nova.compute.manager [req-7069b7f1-7ea9-454e-962d-6692b08a6fb4 req-e35e09b6-c4d1-49d2-929a-dc40001cc1c2 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Received event network-vif-plugged-254255b0-dc98-4210-b14e-8017dc43bccb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.926 187164 DEBUG oslo_concurrency.lockutils [req-7069b7f1-7ea9-454e-962d-6692b08a6fb4 req-e35e09b6-c4d1-49d2-929a-dc40001cc1c2 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.927 187164 DEBUG oslo_concurrency.lockutils [req-7069b7f1-7ea9-454e-962d-6692b08a6fb4 req-e35e09b6-c4d1-49d2-929a-dc40001cc1c2 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.927 187164 DEBUG oslo_concurrency.lockutils [req-7069b7f1-7ea9-454e-962d-6692b08a6fb4 req-e35e09b6-c4d1-49d2-929a-dc40001cc1c2 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:00:27 np0005546954 nova_compute[187160]: 2025-12-05 13:00:27.927 187164 DEBUG nova.compute.manager [req-7069b7f1-7ea9-454e-962d-6692b08a6fb4 req-e35e09b6-c4d1-49d2-929a-dc40001cc1c2 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Processing event network-vif-plugged-254255b0-dc98-4210-b14e-8017dc43bccb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.001 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939628.001296, a9e7b53a-45dd-415c-9977-b5df73cde5f2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.002 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] VM Started (Lifecycle Event)
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.004 187164 DEBUG nova.compute.manager [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.007 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.016 187164 INFO nova.virt.libvirt.driver [-] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Instance spawned successfully.
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.016 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.019 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.022 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.030 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.031 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.031 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.032 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.032 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.032 187164 DEBUG nova.virt.libvirt.driver [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.038 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.038 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939628.0014923, a9e7b53a-45dd-415c-9977-b5df73cde5f2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.038 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] VM Paused (Lifecycle Event)
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.075 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.084 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939628.006375, a9e7b53a-45dd-415c-9977-b5df73cde5f2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.084 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] VM Resumed (Lifecycle Event)
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.107 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.114 187164 INFO nova.compute.manager [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Took 11.59 seconds to spawn the instance on the hypervisor.
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.114 187164 DEBUG nova.compute.manager [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.116 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.161 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.197 187164 INFO nova.compute.manager [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Took 13.60 seconds to build instance.
Dec  5 08:00:28 np0005546954 nova_compute[187160]: 2025-12-05 13:00:28.223 187164 DEBUG oslo_concurrency.lockutils [None req-1df1ebd6-faf8-40da-bd3a-0452c4aa69d5 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 08:00:28 np0005546954 podman[215499]: 2025-12-05 13:00:28.309425582 +0000 UTC m=+0.063185531 container create b939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:00:28 np0005546954 systemd[1]: Started libpod-conmon-b939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f.scope.
Dec  5 08:00:28 np0005546954 systemd[1]: Started libcrun container.
Dec  5 08:00:28 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b33b9ed2cc468ab57c78f3016fb90c5f420956c461da16fc2af35cefd45fe5c2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 08:00:28 np0005546954 podman[215499]: 2025-12-05 13:00:28.281250794 +0000 UTC m=+0.035010723 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 08:00:28 np0005546954 podman[215499]: 2025-12-05 13:00:28.382048096 +0000 UTC m=+0.135808045 container init b939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:00:28 np0005546954 podman[215499]: 2025-12-05 13:00:28.386918188 +0000 UTC m=+0.140678117 container start b939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:00:28 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[215512]: [NOTICE]   (215516) : New worker (215518) forked
Dec  5 08:00:28 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[215512]: [NOTICE]   (215516) : Loading success.
Dec  5 08:00:30 np0005546954 nova_compute[187160]: 2025-12-05 13:00:30.004 187164 DEBUG nova.compute.manager [req-4c609e4d-64a0-4af3-a367-28d5b3a3f034 req-e8460cd5-b8e8-4392-80f5-a1dc6542d702 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Received event network-vif-plugged-254255b0-dc98-4210-b14e-8017dc43bccb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 08:00:30 np0005546954 nova_compute[187160]: 2025-12-05 13:00:30.005 187164 DEBUG oslo_concurrency.lockutils [req-4c609e4d-64a0-4af3-a367-28d5b3a3f034 req-e8460cd5-b8e8-4392-80f5-a1dc6542d702 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 08:00:30 np0005546954 nova_compute[187160]: 2025-12-05 13:00:30.005 187164 DEBUG oslo_concurrency.lockutils [req-4c609e4d-64a0-4af3-a367-28d5b3a3f034 req-e8460cd5-b8e8-4392-80f5-a1dc6542d702 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 08:00:30 np0005546954 nova_compute[187160]: 2025-12-05 13:00:30.005 187164 DEBUG oslo_concurrency.lockutils [req-4c609e4d-64a0-4af3-a367-28d5b3a3f034 req-e8460cd5-b8e8-4392-80f5-a1dc6542d702 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 08:00:30 np0005546954 nova_compute[187160]: 2025-12-05 13:00:30.005 187164 DEBUG nova.compute.manager [req-4c609e4d-64a0-4af3-a367-28d5b3a3f034 req-e8460cd5-b8e8-4392-80f5-a1dc6542d702 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] No waiting events found dispatching network-vif-plugged-254255b0-dc98-4210-b14e-8017dc43bccb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 08:00:30 np0005546954 nova_compute[187160]: 2025-12-05 13:00:30.006 187164 WARNING nova.compute.manager [req-4c609e4d-64a0-4af3-a367-28d5b3a3f034 req-e8460cd5-b8e8-4392-80f5-a1dc6542d702 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Received unexpected event network-vif-plugged-254255b0-dc98-4210-b14e-8017dc43bccb for instance with vm_state active and task_state None.
Dec  5 08:00:30 np0005546954 nova_compute[187160]: 2025-12-05 13:00:30.716 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:00:31 np0005546954 nova_compute[187160]: 2025-12-05 13:00:31.902 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:00:35 np0005546954 podman[215529]: 2025-12-05 13:00:35.553960902 +0000 UTC m=+0.055297564 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 08:00:35 np0005546954 podman[197513]: time="2025-12-05T13:00:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:00:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:00:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:00:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:00:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3057 "" "Go-http-client/1.1"
Dec  5 08:00:35 np0005546954 nova_compute[187160]: 2025-12-05 13:00:35.759 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:00:36 np0005546954 nova_compute[187160]: 2025-12-05 13:00:36.904 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:00:39 np0005546954 ovn_controller[95566]: 2025-12-05T13:00:39Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:54:54:2c 10.100.0.14
Dec  5 08:00:39 np0005546954 ovn_controller[95566]: 2025-12-05T13:00:39Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:54:54:2c 10.100.0.14
Dec  5 08:00:40 np0005546954 podman[215566]: 2025-12-05 13:00:40.634329435 +0000 UTC m=+0.120333292 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 08:00:40 np0005546954 podman[215565]: 2025-12-05 13:00:40.653754741 +0000 UTC m=+0.139734077 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 08:00:40 np0005546954 nova_compute[187160]: 2025-12-05 13:00:40.762 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:00:41 np0005546954 nova_compute[187160]: 2025-12-05 13:00:41.906 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:00:45 np0005546954 nova_compute[187160]: 2025-12-05 13:00:45.764 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:00:46 np0005546954 nova_compute[187160]: 2025-12-05 13:00:46.909 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:00:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:00:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:00:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:00:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:00:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:00:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:00:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:00:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:00:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:00:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:00:50 np0005546954 nova_compute[187160]: 2025-12-05 13:00:50.766 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:00:51 np0005546954 nova_compute[187160]: 2025-12-05 13:00:51.912 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:00:53 np0005546954 podman[215616]: 2025-12-05 13:00:53.53733761 +0000 UTC m=+0.054631734 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Dec  5 08:00:53 np0005546954 podman[215617]: 2025-12-05 13:00:53.568052388 +0000 UTC m=+0.070292603 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 08:00:55 np0005546954 nova_compute[187160]: 2025-12-05 13:00:55.770 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:00:56 np0005546954 nova_compute[187160]: 2025-12-05 13:00:56.915 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:00:57 np0005546954 ovn_controller[95566]: 2025-12-05T13:00:57Z|00193|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Dec  5 08:01:00 np0005546954 nova_compute[187160]: 2025-12-05 13:01:00.777 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:01:01 np0005546954 nova_compute[187160]: 2025-12-05 13:01:01.917 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:01:05 np0005546954 nova_compute[187160]: 2025-12-05 13:01:05.041 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 08:01:05 np0005546954 nova_compute[187160]: 2025-12-05 13:01:05.042 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  5 08:01:05 np0005546954 nova_compute[187160]: 2025-12-05 13:01:05.042 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  5 08:01:05 np0005546954 podman[197513]: time="2025-12-05T13:01:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:01:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:01:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:01:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:01:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Dec  5 08:01:05 np0005546954 nova_compute[187160]: 2025-12-05 13:01:05.762 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-a9e7b53a-45dd-415c-9977-b5df73cde5f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 08:01:05 np0005546954 nova_compute[187160]: 2025-12-05 13:01:05.763 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-a9e7b53a-45dd-415c-9977-b5df73cde5f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 08:01:05 np0005546954 nova_compute[187160]: 2025-12-05 13:01:05.763 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec  5 08:01:05 np0005546954 nova_compute[187160]: 2025-12-05 13:01:05.763 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a9e7b53a-45dd-415c-9977-b5df73cde5f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 08:01:05 np0005546954 nova_compute[187160]: 2025-12-05 13:01:05.782 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:01:06 np0005546954 podman[215669]: 2025-12-05 13:01:06.546251989 +0000 UTC m=+0.053772938 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec  5 08:01:06 np0005546954 nova_compute[187160]: 2025-12-05 13:01:06.918 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:08 np0005546954 nova_compute[187160]: 2025-12-05 13:01:08.092 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Updating instance_info_cache with network_info: [{"id": "254255b0-dc98-4210-b14e-8017dc43bccb", "address": "fa:16:3e:54:54:2c", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254255b0-dc", "ovs_interfaceid": "254255b0-dc98-4210-b14e-8017dc43bccb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:01:08 np0005546954 nova_compute[187160]: 2025-12-05 13:01:08.118 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-a9e7b53a-45dd-415c-9977-b5df73cde5f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:01:08 np0005546954 nova_compute[187160]: 2025-12-05 13:01:08.118 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 08:01:08 np0005546954 nova_compute[187160]: 2025-12-05 13:01:08.118 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:01:08 np0005546954 nova_compute[187160]: 2025-12-05 13:01:08.119 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:01:10 np0005546954 nova_compute[187160]: 2025-12-05 13:01:10.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:01:10 np0005546954 nova_compute[187160]: 2025-12-05 13:01:10.784 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:11 np0005546954 nova_compute[187160]: 2025-12-05 13:01:11.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:01:11 np0005546954 podman[215690]: 2025-12-05 13:01:11.574006415 +0000 UTC m=+0.079125498 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 08:01:11 np0005546954 podman[215689]: 2025-12-05 13:01:11.591504949 +0000 UTC m=+0.105670085 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  5 08:01:11 np0005546954 nova_compute[187160]: 2025-12-05 13:01:11.920 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.071 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.072 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.072 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.072 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.109 187164 DEBUG nova.virt.libvirt.driver [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Creating tmpfile /var/lib/nova/instances/tmphvqypdbk to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.110 187164 DEBUG nova.compute.manager [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvqypdbk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.147 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.242 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.243 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.296 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.472 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.473 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5681MB free_disk=73.30697250366211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.473 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.474 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.542 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance a9e7b53a-45dd-415c-9977-b5df73cde5f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.565 187164 WARNING nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 125a08e9-c13a-4c0a-ad37-bda6cf4faf94 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.566 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.566 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.620 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.633 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.652 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:01:12 np0005546954 nova_compute[187160]: 2025-12-05 13:01:12.652 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:13 np0005546954 nova_compute[187160]: 2025-12-05 13:01:13.392 187164 DEBUG nova.compute.manager [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvqypdbk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='125a08e9-c13a-4c0a-ad37-bda6cf4faf94',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  5 08:01:13 np0005546954 nova_compute[187160]: 2025-12-05 13:01:13.458 187164 DEBUG oslo_concurrency.lockutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-125a08e9-c13a-4c0a-ad37-bda6cf4faf94" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:01:13 np0005546954 nova_compute[187160]: 2025-12-05 13:01:13.459 187164 DEBUG oslo_concurrency.lockutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-125a08e9-c13a-4c0a-ad37-bda6cf4faf94" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:01:13 np0005546954 nova_compute[187160]: 2025-12-05 13:01:13.459 187164 DEBUG nova.network.neutron [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 08:01:13 np0005546954 nova_compute[187160]: 2025-12-05 13:01:13.652 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:01:14 np0005546954 nova_compute[187160]: 2025-12-05 13:01:14.792 187164 DEBUG nova.network.neutron [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Updating instance_info_cache with network_info: [{"id": "26faa3c8-6110-4c71-bc52-e6408ab1ecee", "address": "fa:16:3e:37:8f:5f", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26faa3c8-61", "ovs_interfaceid": "26faa3c8-6110-4c71-bc52-e6408ab1ecee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:01:14 np0005546954 nova_compute[187160]: 2025-12-05 13:01:14.810 187164 DEBUG oslo_concurrency.lockutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-125a08e9-c13a-4c0a-ad37-bda6cf4faf94" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:01:14 np0005546954 nova_compute[187160]: 2025-12-05 13:01:14.812 187164 DEBUG nova.virt.libvirt.driver [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvqypdbk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='125a08e9-c13a-4c0a-ad37-bda6cf4faf94',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  5 08:01:14 np0005546954 nova_compute[187160]: 2025-12-05 13:01:14.813 187164 DEBUG nova.virt.libvirt.driver [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Creating instance directory: /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  5 08:01:14 np0005546954 nova_compute[187160]: 2025-12-05 13:01:14.813 187164 DEBUG nova.virt.libvirt.driver [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Creating disk.info with the contents: {'/var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk': 'qcow2', '/var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  5 08:01:14 np0005546954 nova_compute[187160]: 2025-12-05 13:01:14.814 187164 DEBUG nova.virt.libvirt.driver [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  5 08:01:14 np0005546954 nova_compute[187160]: 2025-12-05 13:01:14.814 187164 DEBUG nova.objects.instance [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 125a08e9-c13a-4c0a-ad37-bda6cf4faf94 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:01:14 np0005546954 nova_compute[187160]: 2025-12-05 13:01:14.841 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:01:14 np0005546954 nova_compute[187160]: 2025-12-05 13:01:14.939 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:01:14 np0005546954 nova_compute[187160]: 2025-12-05 13:01:14.941 187164 DEBUG oslo_concurrency.lockutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:14 np0005546954 nova_compute[187160]: 2025-12-05 13:01:14.943 187164 DEBUG oslo_concurrency.lockutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:14 np0005546954 nova_compute[187160]: 2025-12-05 13:01:14.974 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.038 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.039 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.094 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.096 187164 DEBUG oslo_concurrency.lockutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.097 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.151 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.152 187164 DEBUG nova.virt.disk.api [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Checking if we can resize image /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.152 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.251 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.252 187164 DEBUG nova.virt.disk.api [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Cannot resize image /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.253 187164 DEBUG nova.objects.instance [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'migration_context' on Instance uuid 125a08e9-c13a-4c0a-ad37-bda6cf4faf94 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.276 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.310 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk.config 485376" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.312 187164 DEBUG nova.virt.libvirt.volume.remotefs [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk.config to /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.312 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk.config /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.772 187164 DEBUG oslo_concurrency.processutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94/disk.config /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.775 187164 DEBUG nova.virt.libvirt.driver [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.778 187164 DEBUG nova.virt.libvirt.vif [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:59:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1518533257',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1518533257',id=19,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:00:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-fub9mvmj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:00:07Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=125a08e9-c13a-4c0a-ad37-bda6cf4faf94,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "26faa3c8-6110-4c71-bc52-e6408ab1ecee", "address": "fa:16:3e:37:8f:5f", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap26faa3c8-61", "ovs_interfaceid": "26faa3c8-6110-4c71-bc52-e6408ab1ecee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.779 187164 DEBUG nova.network.os_vif_util [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "26faa3c8-6110-4c71-bc52-e6408ab1ecee", "address": "fa:16:3e:37:8f:5f", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap26faa3c8-61", "ovs_interfaceid": "26faa3c8-6110-4c71-bc52-e6408ab1ecee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.781 187164 DEBUG nova.network.os_vif_util [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:8f:5f,bridge_name='br-int',has_traffic_filtering=True,id=26faa3c8-6110-4c71-bc52-e6408ab1ecee,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26faa3c8-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.782 187164 DEBUG os_vif [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:8f:5f,bridge_name='br-int',has_traffic_filtering=True,id=26faa3c8-6110-4c71-bc52-e6408ab1ecee,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26faa3c8-61') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.783 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.784 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.785 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.789 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.790 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap26faa3c8-61, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.791 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap26faa3c8-61, col_values=(('external_ids', {'iface-id': '26faa3c8-6110-4c71-bc52-e6408ab1ecee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:37:8f:5f', 'vm-uuid': '125a08e9-c13a-4c0a-ad37-bda6cf4faf94'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.831 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:15 np0005546954 NetworkManager[55665]: <info>  [1764939675.8332] manager: (tap26faa3c8-61): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.837 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.840 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.841 187164 INFO os_vif [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:8f:5f,bridge_name='br-int',has_traffic_filtering=True,id=26faa3c8-6110-4c71-bc52-e6408ab1ecee,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26faa3c8-61')#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.842 187164 DEBUG nova.virt.libvirt.driver [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  5 08:01:15 np0005546954 nova_compute[187160]: 2025-12-05 13:01:15.843 187164 DEBUG nova.compute.manager [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvqypdbk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='125a08e9-c13a-4c0a-ad37-bda6cf4faf94',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  5 08:01:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:16.962 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:16.965 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:16.966 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:16 np0005546954 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  5 08:01:18 np0005546954 nova_compute[187160]: 2025-12-05 13:01:18.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:01:18 np0005546954 nova_compute[187160]: 2025-12-05 13:01:18.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:01:18 np0005546954 nova_compute[187160]: 2025-12-05 13:01:18.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:01:19 np0005546954 nova_compute[187160]: 2025-12-05 13:01:19.008 187164 DEBUG nova.network.neutron [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Port 26faa3c8-6110-4c71-bc52-e6408ab1ecee updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  5 08:01:19 np0005546954 nova_compute[187160]: 2025-12-05 13:01:19.011 187164 DEBUG nova.compute.manager [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphvqypdbk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='125a08e9-c13a-4c0a-ad37-bda6cf4faf94',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  5 08:01:19 np0005546954 nova_compute[187160]: 2025-12-05 13:01:19.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:01:19 np0005546954 systemd[1]: Starting libvirt proxy daemon...
Dec  5 08:01:19 np0005546954 systemd[1]: Started libvirt proxy daemon.
Dec  5 08:01:19 np0005546954 kernel: tap26faa3c8-61: entered promiscuous mode
Dec  5 08:01:19 np0005546954 NetworkManager[55665]: <info>  [1764939679.2927] manager: (tap26faa3c8-61): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Dec  5 08:01:19 np0005546954 ovn_controller[95566]: 2025-12-05T13:01:19Z|00194|binding|INFO|Claiming lport 26faa3c8-6110-4c71-bc52-e6408ab1ecee for this additional chassis.
Dec  5 08:01:19 np0005546954 ovn_controller[95566]: 2025-12-05T13:01:19Z|00195|binding|INFO|26faa3c8-6110-4c71-bc52-e6408ab1ecee: Claiming fa:16:3e:37:8f:5f 10.100.0.12
Dec  5 08:01:19 np0005546954 nova_compute[187160]: 2025-12-05 13:01:19.293 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:19 np0005546954 ovn_controller[95566]: 2025-12-05T13:01:19Z|00196|binding|INFO|Setting lport 26faa3c8-6110-4c71-bc52-e6408ab1ecee ovn-installed in OVS
Dec  5 08:01:19 np0005546954 nova_compute[187160]: 2025-12-05 13:01:19.306 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:19 np0005546954 nova_compute[187160]: 2025-12-05 13:01:19.308 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:19 np0005546954 systemd-machined[153497]: New machine qemu-19-instance-00000013.
Dec  5 08:01:19 np0005546954 systemd-udevd[215804]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:01:19 np0005546954 systemd[1]: Started Virtual Machine qemu-19-instance-00000013.
Dec  5 08:01:19 np0005546954 NetworkManager[55665]: <info>  [1764939679.3453] device (tap26faa3c8-61): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 08:01:19 np0005546954 NetworkManager[55665]: <info>  [1764939679.3466] device (tap26faa3c8-61): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 08:01:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:01:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:01:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:01:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:01:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:01:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:01:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:01:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:01:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:01:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:01:20 np0005546954 nova_compute[187160]: 2025-12-05 13:01:20.173 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939680.1732092, 125a08e9-c13a-4c0a-ad37-bda6cf4faf94 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:01:20 np0005546954 nova_compute[187160]: 2025-12-05 13:01:20.175 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] VM Started (Lifecycle Event)#033[00m
Dec  5 08:01:20 np0005546954 nova_compute[187160]: 2025-12-05 13:01:20.198 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:01:20 np0005546954 nova_compute[187160]: 2025-12-05 13:01:20.833 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:20 np0005546954 nova_compute[187160]: 2025-12-05 13:01:20.979 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939680.9785097, 125a08e9-c13a-4c0a-ad37-bda6cf4faf94 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:01:20 np0005546954 nova_compute[187160]: 2025-12-05 13:01:20.979 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] VM Resumed (Lifecycle Event)#033[00m
Dec  5 08:01:21 np0005546954 nova_compute[187160]: 2025-12-05 13:01:21.004 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:01:21 np0005546954 nova_compute[187160]: 2025-12-05 13:01:21.007 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:01:21 np0005546954 nova_compute[187160]: 2025-12-05 13:01:21.033 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  5 08:01:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:21.434 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:01:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:21.435 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 08:01:21 np0005546954 nova_compute[187160]: 2025-12-05 13:01:21.488 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:24 np0005546954 podman[215836]: 2025-12-05 13:01:24.554614887 +0000 UTC m=+0.062074726 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec  5 08:01:24 np0005546954 podman[215835]: 2025-12-05 13:01:24.554872845 +0000 UTC m=+0.062297283 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, version=9.6)
Dec  5 08:01:25 np0005546954 nova_compute[187160]: 2025-12-05 13:01:25.837 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:27 np0005546954 ovn_controller[95566]: 2025-12-05T13:01:27Z|00197|binding|INFO|Claiming lport 26faa3c8-6110-4c71-bc52-e6408ab1ecee for this chassis.
Dec  5 08:01:27 np0005546954 ovn_controller[95566]: 2025-12-05T13:01:27Z|00198|binding|INFO|26faa3c8-6110-4c71-bc52-e6408ab1ecee: Claiming fa:16:3e:37:8f:5f 10.100.0.12
Dec  5 08:01:27 np0005546954 ovn_controller[95566]: 2025-12-05T13:01:27Z|00199|binding|INFO|Setting lport 26faa3c8-6110-4c71-bc52-e6408ab1ecee up in Southbound
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.208 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:8f:5f 10.100.0.12'], port_security=['fa:16:3e:37:8f:5f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '125a08e9-c13a-4c0a-ad37-bda6cf4faf94', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '11', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=26faa3c8-6110-4c71-bc52-e6408ab1ecee) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.209 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 26faa3c8-6110-4c71-bc52-e6408ab1ecee in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.210 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.224 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[a7fe9769-d523-45a9-9a81-4743a46461b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.253 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[fce541ce-0dbb-47c9-b00f-7e69e47e14b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.257 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[bf4a53b4-c238-4c40-8996-c3804a39ed4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.281 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[39ab8f32-6211-48e8-89eb-30ad5b2f33e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.297 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[a2a8b0f7-1f02-4312-b267-8b51df5ac8da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463430, 'reachable_time': 16106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215881, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.312 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f15e1638-ec6d-4a91-a6c0-51166049c6c0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463445, 'tstamp': 463445}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215882, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463448, 'tstamp': 463448}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215882, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.313 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:27 np0005546954 nova_compute[187160]: 2025-12-05 13:01:27.362 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:27 np0005546954 nova_compute[187160]: 2025-12-05 13:01:27.364 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.364 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.364 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.365 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:27.365 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:01:28 np0005546954 nova_compute[187160]: 2025-12-05 13:01:28.269 187164 INFO nova.compute.manager [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Post operation of migration started#033[00m
Dec  5 08:01:28 np0005546954 nova_compute[187160]: 2025-12-05 13:01:28.982 187164 DEBUG oslo_concurrency.lockutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-125a08e9-c13a-4c0a-ad37-bda6cf4faf94" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:01:28 np0005546954 nova_compute[187160]: 2025-12-05 13:01:28.982 187164 DEBUG oslo_concurrency.lockutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-125a08e9-c13a-4c0a-ad37-bda6cf4faf94" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:01:28 np0005546954 nova_compute[187160]: 2025-12-05 13:01:28.982 187164 DEBUG nova.network.neutron [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 08:01:30 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:30.438 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:30 np0005546954 nova_compute[187160]: 2025-12-05 13:01:30.839 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:31 np0005546954 nova_compute[187160]: 2025-12-05 13:01:31.076 187164 DEBUG nova.network.neutron [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Updating instance_info_cache with network_info: [{"id": "26faa3c8-6110-4c71-bc52-e6408ab1ecee", "address": "fa:16:3e:37:8f:5f", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26faa3c8-61", "ovs_interfaceid": "26faa3c8-6110-4c71-bc52-e6408ab1ecee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:01:31 np0005546954 nova_compute[187160]: 2025-12-05 13:01:31.107 187164 DEBUG oslo_concurrency.lockutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-125a08e9-c13a-4c0a-ad37-bda6cf4faf94" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:01:31 np0005546954 nova_compute[187160]: 2025-12-05 13:01:31.128 187164 DEBUG oslo_concurrency.lockutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:31 np0005546954 nova_compute[187160]: 2025-12-05 13:01:31.129 187164 DEBUG oslo_concurrency.lockutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:31 np0005546954 nova_compute[187160]: 2025-12-05 13:01:31.129 187164 DEBUG oslo_concurrency.lockutils [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:31 np0005546954 nova_compute[187160]: 2025-12-05 13:01:31.133 187164 INFO nova.virt.libvirt.driver [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  5 08:01:31 np0005546954 virtqemud[186730]: Domain id=19 name='instance-00000013' uuid=125a08e9-c13a-4c0a-ad37-bda6cf4faf94 is tainted: custom-monitor
Dec  5 08:01:32 np0005546954 nova_compute[187160]: 2025-12-05 13:01:32.141 187164 INFO nova.virt.libvirt.driver [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  5 08:01:33 np0005546954 nova_compute[187160]: 2025-12-05 13:01:33.148 187164 INFO nova.virt.libvirt.driver [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  5 08:01:33 np0005546954 nova_compute[187160]: 2025-12-05 13:01:33.155 187164 DEBUG nova.compute.manager [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:01:33 np0005546954 nova_compute[187160]: 2025-12-05 13:01:33.175 187164 DEBUG nova.objects.instance [None req-a71b6fc0-538d-4232-937d-65077dadea24 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.349 187164 DEBUG oslo_concurrency.lockutils [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.349 187164 DEBUG oslo_concurrency.lockutils [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.350 187164 DEBUG oslo_concurrency.lockutils [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.350 187164 DEBUG oslo_concurrency.lockutils [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.350 187164 DEBUG oslo_concurrency.lockutils [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.351 187164 INFO nova.compute.manager [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Terminating instance#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.352 187164 DEBUG nova.compute.manager [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 08:01:35 np0005546954 kernel: tap254255b0-dc (unregistering): left promiscuous mode
Dec  5 08:01:35 np0005546954 NetworkManager[55665]: <info>  [1764939695.3732] device (tap254255b0-dc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.380 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:35 np0005546954 ovn_controller[95566]: 2025-12-05T13:01:35Z|00200|binding|INFO|Releasing lport 254255b0-dc98-4210-b14e-8017dc43bccb from this chassis (sb_readonly=0)
Dec  5 08:01:35 np0005546954 ovn_controller[95566]: 2025-12-05T13:01:35Z|00201|binding|INFO|Setting lport 254255b0-dc98-4210-b14e-8017dc43bccb down in Southbound
Dec  5 08:01:35 np0005546954 ovn_controller[95566]: 2025-12-05T13:01:35Z|00202|binding|INFO|Removing iface tap254255b0-dc ovn-installed in OVS
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.384 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.390 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:54:2c 10.100.0.14'], port_security=['fa:16:3e:54:54:2c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'a9e7b53a-45dd-415c-9977-b5df73cde5f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=254255b0-dc98-4210-b14e-8017dc43bccb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.391 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 254255b0-dc98-4210-b14e-8017dc43bccb in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.393 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.395 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.408 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[8766caa3-cd6d-4ef7-adde-e76e6543ae23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:35 np0005546954 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000014.scope: Deactivated successfully.
Dec  5 08:01:35 np0005546954 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000014.scope: Consumed 14.486s CPU time.
Dec  5 08:01:35 np0005546954 systemd-machined[153497]: Machine qemu-18-instance-00000014 terminated.
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.434 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[2a2b9f24-1e1a-49f7-ad4a-0ab18fd026a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.437 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc15703-4cf5-44c1-a373-54e95ae2125d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.464 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[f3910bdf-68e7-4382-aacd-85355f36e2d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.479 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed5b9df-b3cf-4f9c-8505-ee09573cce83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463430, 'reachable_time': 16106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215894, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.495 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[194ad9e6-0992-4cc5-bdd4-76aad1ff15a2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463445, 'tstamp': 463445}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215895, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463448, 'tstamp': 463448}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215895, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.496 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.497 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.501 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.502 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.502 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.502 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:35.502 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.571 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.576 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.610 187164 INFO nova.virt.libvirt.driver [-] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Instance destroyed successfully.#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.610 187164 DEBUG nova.objects.instance [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'resources' on Instance uuid a9e7b53a-45dd-415c-9977-b5df73cde5f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.630 187164 DEBUG nova.virt.libvirt.vif [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T13:00:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1525595846',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1525595846',id=20,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:00:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-4143b8k5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_
name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T13:00:28Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=a9e7b53a-45dd-415c-9977-b5df73cde5f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "254255b0-dc98-4210-b14e-8017dc43bccb", "address": "fa:16:3e:54:54:2c", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254255b0-dc", "ovs_interfaceid": "254255b0-dc98-4210-b14e-8017dc43bccb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.631 187164 DEBUG nova.network.os_vif_util [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "254255b0-dc98-4210-b14e-8017dc43bccb", "address": "fa:16:3e:54:54:2c", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap254255b0-dc", "ovs_interfaceid": "254255b0-dc98-4210-b14e-8017dc43bccb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.631 187164 DEBUG nova.network.os_vif_util [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:54:54:2c,bridge_name='br-int',has_traffic_filtering=True,id=254255b0-dc98-4210-b14e-8017dc43bccb,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254255b0-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.631 187164 DEBUG os_vif [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:54:2c,bridge_name='br-int',has_traffic_filtering=True,id=254255b0-dc98-4210-b14e-8017dc43bccb,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254255b0-dc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.633 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.633 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap254255b0-dc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.634 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.635 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.637 187164 INFO os_vif [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:54:2c,bridge_name='br-int',has_traffic_filtering=True,id=254255b0-dc98-4210-b14e-8017dc43bccb,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap254255b0-dc')#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.638 187164 INFO nova.virt.libvirt.driver [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Deleting instance files /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2_del#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.638 187164 INFO nova.virt.libvirt.driver [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Deletion of /var/lib/nova/instances/a9e7b53a-45dd-415c-9977-b5df73cde5f2_del complete#033[00m
Dec  5 08:01:35 np0005546954 podman[197513]: time="2025-12-05T13:01:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:01:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:01:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:01:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:01:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3057 "" "Go-http-client/1.1"
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.707 187164 INFO nova.compute.manager [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.708 187164 DEBUG oslo.service.loopingcall [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.708 187164 DEBUG nova.compute.manager [-] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.708 187164 DEBUG nova.network.neutron [-] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 08:01:35 np0005546954 nova_compute[187160]: 2025-12-05 13:01:35.840 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:36 np0005546954 nova_compute[187160]: 2025-12-05 13:01:36.016 187164 DEBUG nova.compute.manager [req-e52ed982-f42b-4cd5-8ec6-e16a14529e67 req-4570fa57-1895-45e1-83ee-500fc77528c3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Received event network-vif-unplugged-254255b0-dc98-4210-b14e-8017dc43bccb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:01:36 np0005546954 nova_compute[187160]: 2025-12-05 13:01:36.017 187164 DEBUG oslo_concurrency.lockutils [req-e52ed982-f42b-4cd5-8ec6-e16a14529e67 req-4570fa57-1895-45e1-83ee-500fc77528c3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:36 np0005546954 nova_compute[187160]: 2025-12-05 13:01:36.017 187164 DEBUG oslo_concurrency.lockutils [req-e52ed982-f42b-4cd5-8ec6-e16a14529e67 req-4570fa57-1895-45e1-83ee-500fc77528c3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:36 np0005546954 nova_compute[187160]: 2025-12-05 13:01:36.018 187164 DEBUG oslo_concurrency.lockutils [req-e52ed982-f42b-4cd5-8ec6-e16a14529e67 req-4570fa57-1895-45e1-83ee-500fc77528c3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:36 np0005546954 nova_compute[187160]: 2025-12-05 13:01:36.018 187164 DEBUG nova.compute.manager [req-e52ed982-f42b-4cd5-8ec6-e16a14529e67 req-4570fa57-1895-45e1-83ee-500fc77528c3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] No waiting events found dispatching network-vif-unplugged-254255b0-dc98-4210-b14e-8017dc43bccb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:01:36 np0005546954 nova_compute[187160]: 2025-12-05 13:01:36.019 187164 DEBUG nova.compute.manager [req-e52ed982-f42b-4cd5-8ec6-e16a14529e67 req-4570fa57-1895-45e1-83ee-500fc77528c3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Received event network-vif-unplugged-254255b0-dc98-4210-b14e-8017dc43bccb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 08:01:36 np0005546954 nova_compute[187160]: 2025-12-05 13:01:36.856 187164 DEBUG nova.network.neutron [-] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:01:36 np0005546954 nova_compute[187160]: 2025-12-05 13:01:36.887 187164 INFO nova.compute.manager [-] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Took 1.18 seconds to deallocate network for instance.#033[00m
Dec  5 08:01:36 np0005546954 nova_compute[187160]: 2025-12-05 13:01:36.942 187164 DEBUG oslo_concurrency.lockutils [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:36 np0005546954 nova_compute[187160]: 2025-12-05 13:01:36.942 187164 DEBUG oslo_concurrency.lockutils [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.002 187164 DEBUG nova.compute.provider_tree [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.020 187164 DEBUG nova.scheduler.client.report [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.042 187164 DEBUG oslo_concurrency.lockutils [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.066 187164 INFO nova.scheduler.client.report [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Deleted allocations for instance a9e7b53a-45dd-415c-9977-b5df73cde5f2#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.135 187164 DEBUG oslo_concurrency.lockutils [None req-d75a9774-225e-4668-9020-0a429b190550 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.530 187164 DEBUG oslo_concurrency.lockutils [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "125a08e9-c13a-4c0a-ad37-bda6cf4faf94" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.530 187164 DEBUG oslo_concurrency.lockutils [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "125a08e9-c13a-4c0a-ad37-bda6cf4faf94" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.530 187164 DEBUG oslo_concurrency.lockutils [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "125a08e9-c13a-4c0a-ad37-bda6cf4faf94-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.530 187164 DEBUG oslo_concurrency.lockutils [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "125a08e9-c13a-4c0a-ad37-bda6cf4faf94-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.531 187164 DEBUG oslo_concurrency.lockutils [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "125a08e9-c13a-4c0a-ad37-bda6cf4faf94-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.531 187164 INFO nova.compute.manager [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Terminating instance#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.532 187164 DEBUG nova.compute.manager [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 08:01:37 np0005546954 kernel: tap26faa3c8-61 (unregistering): left promiscuous mode
Dec  5 08:01:37 np0005546954 NetworkManager[55665]: <info>  [1764939697.5556] device (tap26faa3c8-61): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 08:01:37 np0005546954 podman[215913]: 2025-12-05 13:01:37.583408659 +0000 UTC m=+0.090494082 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 08:01:37 np0005546954 ovn_controller[95566]: 2025-12-05T13:01:37Z|00203|binding|INFO|Releasing lport 26faa3c8-6110-4c71-bc52-e6408ab1ecee from this chassis (sb_readonly=0)
Dec  5 08:01:37 np0005546954 ovn_controller[95566]: 2025-12-05T13:01:37Z|00204|binding|INFO|Setting lport 26faa3c8-6110-4c71-bc52-e6408ab1ecee down in Southbound
Dec  5 08:01:37 np0005546954 ovn_controller[95566]: 2025-12-05T13:01:37Z|00205|binding|INFO|Removing iface tap26faa3c8-61 ovn-installed in OVS
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.671 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.680 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:8f:5f 10.100.0.12'], port_security=['fa:16:3e:37:8f:5f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '125a08e9-c13a-4c0a-ad37-bda6cf4faf94', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '13', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=26faa3c8-6110-4c71-bc52-e6408ab1ecee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.681 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 26faa3c8-6110-4c71-bc52-e6408ab1ecee in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.682 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d4389bc8-2898-48b0-9741-5183b54fe83c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.683 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[23396755-6810-4d8e-8d87-be085eb6831c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.684 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace which is not needed anymore#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.686 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:37 np0005546954 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000013.scope: Deactivated successfully.
Dec  5 08:01:37 np0005546954 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000013.scope: Consumed 1.939s CPU time.
Dec  5 08:01:37 np0005546954 systemd-machined[153497]: Machine qemu-19-instance-00000013 terminated.
Dec  5 08:01:37 np0005546954 NetworkManager[55665]: <info>  [1764939697.7553] manager: (tap26faa3c8-61): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.810 187164 INFO nova.virt.libvirt.driver [-] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Instance destroyed successfully.#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.812 187164 DEBUG nova.objects.instance [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'resources' on Instance uuid 125a08e9-c13a-4c0a-ad37-bda6cf4faf94 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:01:37 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[215512]: [NOTICE]   (215516) : haproxy version is 2.8.14-c23fe91
Dec  5 08:01:37 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[215512]: [NOTICE]   (215516) : path to executable is /usr/sbin/haproxy
Dec  5 08:01:37 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[215512]: [WARNING]  (215516) : Exiting Master process...
Dec  5 08:01:37 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[215512]: [ALERT]    (215516) : Current worker (215518) exited with code 143 (Terminated)
Dec  5 08:01:37 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[215512]: [WARNING]  (215516) : All workers exited. Exiting... (0)
Dec  5 08:01:37 np0005546954 systemd[1]: libpod-b939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f.scope: Deactivated successfully.
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.827 187164 DEBUG nova.virt.libvirt.vif [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:59:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1518533257',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1518533257',id=19,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:00:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-fub9mvmj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T13:01:33Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=125a08e9-c13a-4c0a-ad37-bda6cf4faf94,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "26faa3c8-6110-4c71-bc52-e6408ab1ecee", "address": "fa:16:3e:37:8f:5f", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26faa3c8-61", "ovs_interfaceid": "26faa3c8-6110-4c71-bc52-e6408ab1ecee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.828 187164 DEBUG nova.network.os_vif_util [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "26faa3c8-6110-4c71-bc52-e6408ab1ecee", "address": "fa:16:3e:37:8f:5f", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26faa3c8-61", "ovs_interfaceid": "26faa3c8-6110-4c71-bc52-e6408ab1ecee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:01:37 np0005546954 podman[215960]: 2025-12-05 13:01:37.828969704 +0000 UTC m=+0.048198523 container died b939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.829 187164 DEBUG nova.network.os_vif_util [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:8f:5f,bridge_name='br-int',has_traffic_filtering=True,id=26faa3c8-6110-4c71-bc52-e6408ab1ecee,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26faa3c8-61') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.829 187164 DEBUG os_vif [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:8f:5f,bridge_name='br-int',has_traffic_filtering=True,id=26faa3c8-6110-4c71-bc52-e6408ab1ecee,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26faa3c8-61') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.830 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.831 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap26faa3c8-61, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.832 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.833 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.835 187164 INFO os_vif [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:8f:5f,bridge_name='br-int',has_traffic_filtering=True,id=26faa3c8-6110-4c71-bc52-e6408ab1ecee,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26faa3c8-61')#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.835 187164 INFO nova.virt.libvirt.driver [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Deleting instance files /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94_del#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.836 187164 INFO nova.virt.libvirt.driver [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Deletion of /var/lib/nova/instances/125a08e9-c13a-4c0a-ad37-bda6cf4faf94_del complete#033[00m
Dec  5 08:01:37 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f-userdata-shm.mount: Deactivated successfully.
Dec  5 08:01:37 np0005546954 systemd[1]: var-lib-containers-storage-overlay-b33b9ed2cc468ab57c78f3016fb90c5f420956c461da16fc2af35cefd45fe5c2-merged.mount: Deactivated successfully.
Dec  5 08:01:37 np0005546954 podman[215960]: 2025-12-05 13:01:37.865346668 +0000 UTC m=+0.084575497 container cleanup b939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 08:01:37 np0005546954 systemd[1]: libpod-conmon-b939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f.scope: Deactivated successfully.
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.895 187164 INFO nova.compute.manager [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.896 187164 DEBUG oslo.service.loopingcall [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.896 187164 DEBUG nova.compute.manager [-] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.896 187164 DEBUG nova.network.neutron [-] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 08:01:37 np0005546954 podman[215999]: 2025-12-05 13:01:37.925577056 +0000 UTC m=+0.039155111 container remove b939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.930 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[77971c42-80ee-4569-a9cd-b6add8682b07]: (4, ('Fri Dec  5 01:01:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (b939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f)\nb939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f\nFri Dec  5 01:01:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (b939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f)\nb939c4b5a35ac424fbc52141e94182564d9ecd9c040231e62883ae1c1e1a880f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.932 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[6f7b9776-712e-4211-9a02-2aa0b3d46eab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.932 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.934 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:37 np0005546954 kernel: tapd4389bc8-20: left promiscuous mode
Dec  5 08:01:37 np0005546954 nova_compute[187160]: 2025-12-05 13:01:37.945 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.948 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[697f8be9-3acc-4e24-b351-7024731538fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.960 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[da4a0c4c-55b9-4075-bbe3-efc2313978ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.961 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[42dc225d-9cce-484c-a13b-582f99eac190]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.982 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[228640a4-edfc-46f6-8835-5036f0870927]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463421, 'reachable_time': 16178, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216013, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.985 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 08:01:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:01:37.985 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9338ca-2246-4787-a672-59726637c11d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:01:37 np0005546954 systemd[1]: run-netns-ovnmeta\x2dd4389bc8\x2d2898\x2d48b0\x2d9741\x2d5183b54fe83c.mount: Deactivated successfully.
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.142 187164 DEBUG nova.compute.manager [req-1e4bc003-9a4a-4761-a404-3c4701db9a1a req-5b9bcf6f-1c83-4baf-8d8f-696e1a22da0c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Received event network-vif-plugged-254255b0-dc98-4210-b14e-8017dc43bccb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.143 187164 DEBUG oslo_concurrency.lockutils [req-1e4bc003-9a4a-4761-a404-3c4701db9a1a req-5b9bcf6f-1c83-4baf-8d8f-696e1a22da0c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.143 187164 DEBUG oslo_concurrency.lockutils [req-1e4bc003-9a4a-4761-a404-3c4701db9a1a req-5b9bcf6f-1c83-4baf-8d8f-696e1a22da0c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.143 187164 DEBUG oslo_concurrency.lockutils [req-1e4bc003-9a4a-4761-a404-3c4701db9a1a req-5b9bcf6f-1c83-4baf-8d8f-696e1a22da0c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "a9e7b53a-45dd-415c-9977-b5df73cde5f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.144 187164 DEBUG nova.compute.manager [req-1e4bc003-9a4a-4761-a404-3c4701db9a1a req-5b9bcf6f-1c83-4baf-8d8f-696e1a22da0c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] No waiting events found dispatching network-vif-plugged-254255b0-dc98-4210-b14e-8017dc43bccb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.144 187164 WARNING nova.compute.manager [req-1e4bc003-9a4a-4761-a404-3c4701db9a1a req-5b9bcf6f-1c83-4baf-8d8f-696e1a22da0c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Received unexpected event network-vif-plugged-254255b0-dc98-4210-b14e-8017dc43bccb for instance with vm_state deleted and task_state None.#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.144 187164 DEBUG nova.compute.manager [req-1e4bc003-9a4a-4761-a404-3c4701db9a1a req-5b9bcf6f-1c83-4baf-8d8f-696e1a22da0c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Received event network-vif-deleted-254255b0-dc98-4210-b14e-8017dc43bccb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.693 187164 DEBUG nova.network.neutron [-] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.724 187164 INFO nova.compute.manager [-] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Took 0.83 seconds to deallocate network for instance.#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.761 187164 DEBUG oslo_concurrency.lockutils [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.762 187164 DEBUG oslo_concurrency.lockutils [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.767 187164 DEBUG oslo_concurrency.lockutils [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.786 187164 INFO nova.scheduler.client.report [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Deleted allocations for instance 125a08e9-c13a-4c0a-ad37-bda6cf4faf94#033[00m
Dec  5 08:01:38 np0005546954 nova_compute[187160]: 2025-12-05 13:01:38.845 187164 DEBUG oslo_concurrency.lockutils [None req-5b602aab-7a49-4ed7-af2e-112d0bc5b3e2 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "125a08e9-c13a-4c0a-ad37-bda6cf4faf94" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.235 187164 DEBUG nova.compute.manager [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Received event network-vif-unplugged-26faa3c8-6110-4c71-bc52-e6408ab1ecee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.235 187164 DEBUG oslo_concurrency.lockutils [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "125a08e9-c13a-4c0a-ad37-bda6cf4faf94-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.235 187164 DEBUG oslo_concurrency.lockutils [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "125a08e9-c13a-4c0a-ad37-bda6cf4faf94-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.235 187164 DEBUG oslo_concurrency.lockutils [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "125a08e9-c13a-4c0a-ad37-bda6cf4faf94-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.236 187164 DEBUG nova.compute.manager [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] No waiting events found dispatching network-vif-unplugged-26faa3c8-6110-4c71-bc52-e6408ab1ecee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.236 187164 WARNING nova.compute.manager [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Received unexpected event network-vif-unplugged-26faa3c8-6110-4c71-bc52-e6408ab1ecee for instance with vm_state deleted and task_state None.#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.236 187164 DEBUG nova.compute.manager [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Received event network-vif-plugged-26faa3c8-6110-4c71-bc52-e6408ab1ecee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.237 187164 DEBUG oslo_concurrency.lockutils [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "125a08e9-c13a-4c0a-ad37-bda6cf4faf94-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.237 187164 DEBUG oslo_concurrency.lockutils [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "125a08e9-c13a-4c0a-ad37-bda6cf4faf94-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.237 187164 DEBUG oslo_concurrency.lockutils [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "125a08e9-c13a-4c0a-ad37-bda6cf4faf94-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.237 187164 DEBUG nova.compute.manager [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] No waiting events found dispatching network-vif-plugged-26faa3c8-6110-4c71-bc52-e6408ab1ecee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.237 187164 WARNING nova.compute.manager [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Received unexpected event network-vif-plugged-26faa3c8-6110-4c71-bc52-e6408ab1ecee for instance with vm_state deleted and task_state None.#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.237 187164 DEBUG nova.compute.manager [req-b10ee69a-3f04-4f49-993c-fc7dda4d5a6c req-01b4be7f-2e02-453b-8164-6221b3d5c3fd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Received event network-vif-deleted-26faa3c8-6110-4c71-bc52-e6408ab1ecee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:01:40 np0005546954 nova_compute[187160]: 2025-12-05 13:01:40.842 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:42 np0005546954 podman[216015]: 2025-12-05 13:01:42.547446578 +0000 UTC m=+0.056205543 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:01:42 np0005546954 podman[216014]: 2025-12-05 13:01:42.579164947 +0000 UTC m=+0.088169120 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  5 08:01:42 np0005546954 nova_compute[187160]: 2025-12-05 13:01:42.832 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:45 np0005546954 nova_compute[187160]: 2025-12-05 13:01:45.844 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:47 np0005546954 nova_compute[187160]: 2025-12-05 13:01:47.835 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:50 np0005546954 openstack_network_exporter[199661]: ERROR   13:01:50 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:01:50 np0005546954 openstack_network_exporter[199661]: ERROR   13:01:50 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:01:50 np0005546954 openstack_network_exporter[199661]: ERROR   13:01:50 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:01:50 np0005546954 openstack_network_exporter[199661]: ERROR   13:01:50 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:01:50 np0005546954 openstack_network_exporter[199661]: ERROR   13:01:50 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:01:50 np0005546954 nova_compute[187160]: 2025-12-05 13:01:50.609 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939695.6079216, a9e7b53a-45dd-415c-9977-b5df73cde5f2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:01:50 np0005546954 nova_compute[187160]: 2025-12-05 13:01:50.609 187164 INFO nova.compute.manager [-] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] VM Stopped (Lifecycle Event)#033[00m
Dec  5 08:01:50 np0005546954 nova_compute[187160]: 2025-12-05 13:01:50.631 187164 DEBUG nova.compute.manager [None req-3c9fe876-b45b-42b6-ae32-2170afb2f183 - - - - - -] [instance: a9e7b53a-45dd-415c-9977-b5df73cde5f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:01:50 np0005546954 nova_compute[187160]: 2025-12-05 13:01:50.846 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:52 np0005546954 nova_compute[187160]: 2025-12-05 13:01:52.809 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939697.8083396, 125a08e9-c13a-4c0a-ad37-bda6cf4faf94 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:01:52 np0005546954 nova_compute[187160]: 2025-12-05 13:01:52.809 187164 INFO nova.compute.manager [-] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] VM Stopped (Lifecycle Event)#033[00m
Dec  5 08:01:52 np0005546954 nova_compute[187160]: 2025-12-05 13:01:52.836 187164 DEBUG nova.compute.manager [None req-8d7b4fe4-a58a-4559-9930-fe92b57565c0 - - - - - -] [instance: 125a08e9-c13a-4c0a-ad37-bda6cf4faf94] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:01:52 np0005546954 nova_compute[187160]: 2025-12-05 13:01:52.838 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:55 np0005546954 podman[216067]: 2025-12-05 13:01:55.541938769 +0000 UTC m=+0.056200421 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, io.buildah.version=1.41.3)
Dec  5 08:01:55 np0005546954 podman[216066]: 2025-12-05 13:01:55.5419789 +0000 UTC m=+0.056919842 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec  5 08:01:55 np0005546954 nova_compute[187160]: 2025-12-05 13:01:55.886 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:01:57 np0005546954 nova_compute[187160]: 2025-12-05 13:01:57.841 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:00.877 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:02:00 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:00.877 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 08:02:00 np0005546954 nova_compute[187160]: 2025-12-05 13:02:00.878 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:00 np0005546954 nova_compute[187160]: 2025-12-05 13:02:00.887 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:02 np0005546954 nova_compute[187160]: 2025-12-05 13:02:02.844 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:05 np0005546954 podman[197513]: time="2025-12-05T13:02:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:02:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:02:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:02:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:02:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Dec  5 08:02:05 np0005546954 nova_compute[187160]: 2025-12-05 13:02:05.889 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:07 np0005546954 nova_compute[187160]: 2025-12-05 13:02:07.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:02:07 np0005546954 nova_compute[187160]: 2025-12-05 13:02:07.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:02:07 np0005546954 nova_compute[187160]: 2025-12-05 13:02:07.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:02:07 np0005546954 nova_compute[187160]: 2025-12-05 13:02:07.111 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:02:07 np0005546954 nova_compute[187160]: 2025-12-05 13:02:07.112 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:02:07 np0005546954 nova_compute[187160]: 2025-12-05 13:02:07.848 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:08 np0005546954 podman[216107]: 2025-12-05 13:02:08.552855639 +0000 UTC m=+0.065599202 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Dec  5 08:02:09 np0005546954 nova_compute[187160]: 2025-12-05 13:02:09.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:02:10 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:10.879 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:02:10 np0005546954 nova_compute[187160]: 2025-12-05 13:02:10.891 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:12 np0005546954 nova_compute[187160]: 2025-12-05 13:02:12.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:02:12 np0005546954 nova_compute[187160]: 2025-12-05 13:02:12.851 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.323 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.323 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.324 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.324 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:02:13 np0005546954 podman[216130]: 2025-12-05 13:02:13.455191818 +0000 UTC m=+0.060672079 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 08:02:13 np0005546954 podman[216129]: 2025-12-05 13:02:13.492001048 +0000 UTC m=+0.105389924 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.527 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.528 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5869MB free_disk=73.3356704711914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.528 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.529 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.616 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.617 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.651 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.664 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.690 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:02:13 np0005546954 nova_compute[187160]: 2025-12-05 13:02:13.691 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:14 np0005546954 nova_compute[187160]: 2025-12-05 13:02:14.692 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:02:15 np0005546954 nova_compute[187160]: 2025-12-05 13:02:15.937 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:16.964 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:16.964 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:16.964 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:17 np0005546954 nova_compute[187160]: 2025-12-05 13:02:17.853 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.422 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.423 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.448 187164 DEBUG nova.compute.manager [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.531 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.531 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.541 187164 DEBUG nova.virt.hardware [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.542 187164 INFO nova.compute.claims [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.649 187164 DEBUG nova.compute.provider_tree [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.665 187164 DEBUG nova.scheduler.client.report [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.691 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.692 187164 DEBUG nova.compute.manager [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.755 187164 DEBUG nova.compute.manager [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.755 187164 DEBUG nova.network.neutron [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.773 187164 INFO nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.791 187164 DEBUG nova.compute.manager [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.876 187164 DEBUG nova.compute.manager [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.878 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.878 187164 INFO nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Creating image(s)#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.878 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "/var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.879 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.879 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.891 187164 DEBUG oslo_concurrency.processutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.956 187164 DEBUG oslo_concurrency.processutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.958 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.958 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:18 np0005546954 nova_compute[187160]: 2025-12-05 13:02:18.969 187164 DEBUG oslo_concurrency.processutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.024 187164 DEBUG oslo_concurrency.processutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.025 187164 DEBUG oslo_concurrency.processutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.044 187164 DEBUG nova.policy [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ae0bb20ac8b4be99eb1abddc7310436', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.062 187164 DEBUG oslo_concurrency.processutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.062 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.063 187164 DEBUG oslo_concurrency.processutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.131 187164 DEBUG oslo_concurrency.processutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.132 187164 DEBUG nova.virt.disk.api [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Checking if we can resize image /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.132 187164 DEBUG oslo_concurrency.processutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.185 187164 DEBUG oslo_concurrency.processutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.186 187164 DEBUG nova.virt.disk.api [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Cannot resize image /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.186 187164 DEBUG nova.objects.instance [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'migration_context' on Instance uuid 4fe7206a-4625-4e1b-ad62-a53794dfe8f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.205 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.205 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Ensure instance console log exists: /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.206 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.206 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.206 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:02:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:02:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:02:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:02:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:02:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:02:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:02:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:02:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:02:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:02:19 np0005546954 nova_compute[187160]: 2025-12-05 13:02:19.664 187164 DEBUG nova.network.neutron [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Successfully created port: 6d0f8de4-2f58-4be9-973b-d7c01431f90c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 08:02:20 np0005546954 nova_compute[187160]: 2025-12-05 13:02:20.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:02:20 np0005546954 nova_compute[187160]: 2025-12-05 13:02:20.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:02:20 np0005546954 nova_compute[187160]: 2025-12-05 13:02:20.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:02:20 np0005546954 nova_compute[187160]: 2025-12-05 13:02:20.939 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:22 np0005546954 nova_compute[187160]: 2025-12-05 13:02:22.871 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:22 np0005546954 nova_compute[187160]: 2025-12-05 13:02:22.931 187164 DEBUG nova.network.neutron [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Successfully updated port: 6d0f8de4-2f58-4be9-973b-d7c01431f90c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 08:02:22 np0005546954 nova_compute[187160]: 2025-12-05 13:02:22.949 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "refresh_cache-4fe7206a-4625-4e1b-ad62-a53794dfe8f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:02:22 np0005546954 nova_compute[187160]: 2025-12-05 13:02:22.950 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquired lock "refresh_cache-4fe7206a-4625-4e1b-ad62-a53794dfe8f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:02:22 np0005546954 nova_compute[187160]: 2025-12-05 13:02:22.950 187164 DEBUG nova.network.neutron [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 08:02:23 np0005546954 nova_compute[187160]: 2025-12-05 13:02:23.034 187164 DEBUG nova.compute.manager [req-99198045-c720-420e-9a99-f791703707e1 req-44b889e7-34d0-47a6-81f0-87e8bd7008e0 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-changed-6d0f8de4-2f58-4be9-973b-d7c01431f90c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:02:23 np0005546954 nova_compute[187160]: 2025-12-05 13:02:23.034 187164 DEBUG nova.compute.manager [req-99198045-c720-420e-9a99-f791703707e1 req-44b889e7-34d0-47a6-81f0-87e8bd7008e0 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Refreshing instance network info cache due to event network-changed-6d0f8de4-2f58-4be9-973b-d7c01431f90c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 08:02:23 np0005546954 nova_compute[187160]: 2025-12-05 13:02:23.035 187164 DEBUG oslo_concurrency.lockutils [req-99198045-c720-420e-9a99-f791703707e1 req-44b889e7-34d0-47a6-81f0-87e8bd7008e0 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-4fe7206a-4625-4e1b-ad62-a53794dfe8f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:02:23 np0005546954 nova_compute[187160]: 2025-12-05 13:02:23.094 187164 DEBUG nova.network.neutron [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.356 187164 DEBUG nova.network.neutron [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Updating instance_info_cache with network_info: [{"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.379 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Releasing lock "refresh_cache-4fe7206a-4625-4e1b-ad62-a53794dfe8f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.379 187164 DEBUG nova.compute.manager [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Instance network_info: |[{"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.380 187164 DEBUG oslo_concurrency.lockutils [req-99198045-c720-420e-9a99-f791703707e1 req-44b889e7-34d0-47a6-81f0-87e8bd7008e0 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-4fe7206a-4625-4e1b-ad62-a53794dfe8f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.380 187164 DEBUG nova.network.neutron [req-99198045-c720-420e-9a99-f791703707e1 req-44b889e7-34d0-47a6-81f0-87e8bd7008e0 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Refreshing network info cache for port 6d0f8de4-2f58-4be9-973b-d7c01431f90c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.383 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Start _get_guest_xml network_info=[{"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.387 187164 WARNING nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.393 187164 DEBUG nova.virt.libvirt.host [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.394 187164 DEBUG nova.virt.libvirt.host [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.402 187164 DEBUG nova.virt.libvirt.host [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.403 187164 DEBUG nova.virt.libvirt.host [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.404 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.404 187164 DEBUG nova.virt.hardware [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.405 187164 DEBUG nova.virt.hardware [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.405 187164 DEBUG nova.virt.hardware [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.405 187164 DEBUG nova.virt.hardware [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.405 187164 DEBUG nova.virt.hardware [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.406 187164 DEBUG nova.virt.hardware [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.406 187164 DEBUG nova.virt.hardware [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.406 187164 DEBUG nova.virt.hardware [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.407 187164 DEBUG nova.virt.hardware [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.407 187164 DEBUG nova.virt.hardware [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.407 187164 DEBUG nova.virt.hardware [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.411 187164 DEBUG nova.virt.libvirt.vif [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T13:02:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-938314815',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-938314815',id=22,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-bftthp2a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=TagList,task_
state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:02:18Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=4fe7206a-4625-4e1b-ad62-a53794dfe8f7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.411 187164 DEBUG nova.network.os_vif_util [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.412 187164 DEBUG nova.network.os_vif_util [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:db:2d,bridge_name='br-int',has_traffic_filtering=True,id=6d0f8de4-2f58-4be9-973b-d7c01431f90c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0f8de4-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.413 187164 DEBUG nova.objects.instance [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4fe7206a-4625-4e1b-ad62-a53794dfe8f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.464 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] End _get_guest_xml xml=<domain type="kvm">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  <uuid>4fe7206a-4625-4e1b-ad62-a53794dfe8f7</uuid>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  <name>instance-00000016</name>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteStrategies-server-938314815</nova:name>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 13:02:24</nova:creationTime>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:        <nova:user uuid="0ae0bb20ac8b4be99eb1abddc7310436">tempest-TestExecuteStrategies-192029678-project-member</nova:user>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:        <nova:project uuid="e6ae0d0dcde04b85b6dae45560cca988">tempest-TestExecuteStrategies-192029678</nova:project>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:        <nova:port uuid="6d0f8de4-2f58-4be9-973b-d7c01431f90c">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <system>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <entry name="serial">4fe7206a-4625-4e1b-ad62-a53794dfe8f7</entry>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <entry name="uuid">4fe7206a-4625-4e1b-ad62-a53794dfe8f7</entry>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    </system>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  <os>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  </os>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  <features>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  </features>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  </clock>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  <devices>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    </disk>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk.config"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    </disk>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:3e:db:2d"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <target dev="tap6d0f8de4-2f"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    </interface>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/console.log" append="off"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    </serial>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <video>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    </video>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    </rng>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 08:02:24 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 08:02:24 np0005546954 nova_compute[187160]:  </devices>
Dec  5 08:02:24 np0005546954 nova_compute[187160]: </domain>
Dec  5 08:02:24 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.465 187164 DEBUG nova.compute.manager [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Preparing to wait for external event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.466 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.466 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.466 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.467 187164 DEBUG nova.virt.libvirt.vif [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T13:02:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-938314815',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-938314815',id=22,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-bftthp2a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:02:18Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=4fe7206a-4625-4e1b-ad62-a53794dfe8f7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.467 187164 DEBUG nova.network.os_vif_util [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.468 187164 DEBUG nova.network.os_vif_util [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:db:2d,bridge_name='br-int',has_traffic_filtering=True,id=6d0f8de4-2f58-4be9-973b-d7c01431f90c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0f8de4-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.469 187164 DEBUG os_vif [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:db:2d,bridge_name='br-int',has_traffic_filtering=True,id=6d0f8de4-2f58-4be9-973b-d7c01431f90c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0f8de4-2f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.470 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.470 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.471 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.473 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.473 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d0f8de4-2f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.474 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6d0f8de4-2f, col_values=(('external_ids', {'iface-id': '6d0f8de4-2f58-4be9-973b-d7c01431f90c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:db:2d', 'vm-uuid': '4fe7206a-4625-4e1b-ad62-a53794dfe8f7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.476 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:24 np0005546954 NetworkManager[55665]: <info>  [1764939744.4771] manager: (tap6d0f8de4-2f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.478 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.483 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.484 187164 INFO os_vif [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:db:2d,bridge_name='br-int',has_traffic_filtering=True,id=6d0f8de4-2f58-4be9-973b-d7c01431f90c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0f8de4-2f')#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.529 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.529 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.529 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No VIF found with MAC fa:16:3e:3e:db:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 08:02:24 np0005546954 nova_compute[187160]: 2025-12-05 13:02:24.530 187164 INFO nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Using config drive#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.000 187164 INFO nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Creating config drive at /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk.config#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.006 187164 DEBUG oslo_concurrency.processutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_jd8h6d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.133 187164 DEBUG oslo_concurrency.processutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_jd8h6d" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:02:25 np0005546954 kernel: tap6d0f8de4-2f: entered promiscuous mode
Dec  5 08:02:25 np0005546954 NetworkManager[55665]: <info>  [1764939745.1915] manager: (tap6d0f8de4-2f): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.193 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:25 np0005546954 ovn_controller[95566]: 2025-12-05T13:02:25Z|00206|binding|INFO|Claiming lport 6d0f8de4-2f58-4be9-973b-d7c01431f90c for this chassis.
Dec  5 08:02:25 np0005546954 ovn_controller[95566]: 2025-12-05T13:02:25Z|00207|binding|INFO|6d0f8de4-2f58-4be9-973b-d7c01431f90c: Claiming fa:16:3e:3e:db:2d 10.100.0.12
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.202 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:db:2d 10.100.0.12'], port_security=['fa:16:3e:3e:db:2d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4fe7206a-4625-4e1b-ad62-a53794dfe8f7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=6d0f8de4-2f58-4be9-973b-d7c01431f90c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.204 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 6d0f8de4-2f58-4be9-973b-d7c01431f90c in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.205 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 08:02:25 np0005546954 ovn_controller[95566]: 2025-12-05T13:02:25Z|00208|binding|INFO|Setting lport 6d0f8de4-2f58-4be9-973b-d7c01431f90c ovn-installed in OVS
Dec  5 08:02:25 np0005546954 ovn_controller[95566]: 2025-12-05T13:02:25Z|00209|binding|INFO|Setting lport 6d0f8de4-2f58-4be9-973b-d7c01431f90c up in Southbound
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.208 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.209 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.216 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4a744204-1075-4ec8-b901-3094d7f31300]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.217 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd4389bc8-21 in ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 08:02:25 np0005546954 systemd-udevd[216210]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.219 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd4389bc8-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.219 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[90d6a64e-137c-4443-9a62-fd70d73bc6b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.220 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4282cf3a-df68-4512-8f6c-069c3029cf88]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 systemd-machined[153497]: New machine qemu-20-instance-00000016.
Dec  5 08:02:25 np0005546954 NetworkManager[55665]: <info>  [1764939745.2301] device (tap6d0f8de4-2f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 08:02:25 np0005546954 NetworkManager[55665]: <info>  [1764939745.2309] device (tap6d0f8de4-2f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.233 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd6f304-d01b-45da-a74a-b01ea62261bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 systemd[1]: Started Virtual Machine qemu-20-instance-00000016.
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.245 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[52bf4f7f-60a0-4444-9e41-52d6bfac4dcf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.271 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[b9c78be6-6206-4489-8db4-271c9c97ec7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.276 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c439a052-b9f9-45ac-90d8-bd4e4a36c55a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 NetworkManager[55665]: <info>  [1764939745.2770] manager: (tapd4389bc8-20): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.306 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[e468a769-a312-4c9d-956e-99b99f1c196b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.309 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[74157e74-c8f7-45d6-bf36-023227118e64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 NetworkManager[55665]: <info>  [1764939745.3295] device (tapd4389bc8-20): carrier: link connected
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.336 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[cbca97a7-feb3-407d-bf2d-8a55cf7e8448]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.349 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[564a3f16-d503-41af-ae6c-8cd4dabd2c3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475195, 'reachable_time': 25527, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216245, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.363 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b2b8cc64-41a8-41bc-851d-fd0387aab013]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:43f7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475195, 'tstamp': 475195}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216246, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.380 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[cb870e95-4415-49cd-b450-b91406dd76e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475195, 'reachable_time': 25527, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216247, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.407 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[710822e1-224f-4cc4-8950-b57872a89f8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.437 187164 DEBUG nova.compute.manager [req-5be87d7c-d55f-453e-97a4-0834c3a4cf37 req-b2bdbfd1-3cbd-4ec6-acc8-fb1487eaf71d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.438 187164 DEBUG oslo_concurrency.lockutils [req-5be87d7c-d55f-453e-97a4-0834c3a4cf37 req-b2bdbfd1-3cbd-4ec6-acc8-fb1487eaf71d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.438 187164 DEBUG oslo_concurrency.lockutils [req-5be87d7c-d55f-453e-97a4-0834c3a4cf37 req-b2bdbfd1-3cbd-4ec6-acc8-fb1487eaf71d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.438 187164 DEBUG oslo_concurrency.lockutils [req-5be87d7c-d55f-453e-97a4-0834c3a4cf37 req-b2bdbfd1-3cbd-4ec6-acc8-fb1487eaf71d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.439 187164 DEBUG nova.compute.manager [req-5be87d7c-d55f-453e-97a4-0834c3a4cf37 req-b2bdbfd1-3cbd-4ec6-acc8-fb1487eaf71d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Processing event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.461 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5190c0-2537-45bd-89b2-74f7a3c58eb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.462 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.463 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.463 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:02:25 np0005546954 NetworkManager[55665]: <info>  [1764939745.4655] manager: (tapd4389bc8-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.465 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:25 np0005546954 kernel: tapd4389bc8-20: entered promiscuous mode
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.467 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.469 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.470 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:25 np0005546954 ovn_controller[95566]: 2025-12-05T13:02:25Z|00210|binding|INFO|Releasing lport 8dbe2af5-9f18-44ca-8f22-66854bcdd596 from this chassis (sb_readonly=0)
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.481 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.482 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.483 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[fd5c813b-94c8-45db-a115-34e446c4fd77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.483 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 08:02:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:25.484 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'env', 'PROCESS_TAG=haproxy-d4389bc8-2898-48b0-9741-5183b54fe83c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d4389bc8-2898-48b0-9741-5183b54fe83c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.558 187164 DEBUG nova.compute.manager [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.559 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939745.557448, 4fe7206a-4625-4e1b-ad62-a53794dfe8f7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.560 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] VM Started (Lifecycle Event)#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.569 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.573 187164 INFO nova.virt.libvirt.driver [-] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Instance spawned successfully.#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.574 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.595 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.598 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.604 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.605 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.605 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.605 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.606 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.606 187164 DEBUG nova.virt.libvirt.driver [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.638 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.638 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939745.5576324, 4fe7206a-4625-4e1b-ad62-a53794dfe8f7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.639 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] VM Paused (Lifecycle Event)#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.673 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.677 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939745.5688689, 4fe7206a-4625-4e1b-ad62-a53794dfe8f7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.677 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] VM Resumed (Lifecycle Event)#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.690 187164 INFO nova.compute.manager [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Took 6.81 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.690 187164 DEBUG nova.compute.manager [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.700 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.703 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.732 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.755 187164 INFO nova.compute.manager [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Took 7.24 seconds to build instance.#033[00m
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.782 187164 DEBUG oslo_concurrency.lockutils [None req-7b30235b-d4a1-4d85-8bd7-12e00d05f570 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:25 np0005546954 podman[216285]: 2025-12-05 13:02:25.836698018 +0000 UTC m=+0.046812589 container create e48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 08:02:25 np0005546954 systemd[1]: Started libpod-conmon-e48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540.scope.
Dec  5 08:02:25 np0005546954 systemd[1]: Started libcrun container.
Dec  5 08:02:25 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/995972418d9cb052317ad27c7d93f306a595130b862e6f2c0002c6ed0de9077f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 08:02:25 np0005546954 podman[216285]: 2025-12-05 13:02:25.808535867 +0000 UTC m=+0.018650458 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 08:02:25 np0005546954 podman[216285]: 2025-12-05 13:02:25.909619185 +0000 UTC m=+0.119733776 container init e48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 08:02:25 np0005546954 podman[216285]: 2025-12-05 13:02:25.915134306 +0000 UTC m=+0.125248877 container start e48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:02:25 np0005546954 podman[216298]: 2025-12-05 13:02:25.931488252 +0000 UTC m=+0.062909197 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, vcs-type=git, config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible)
Dec  5 08:02:25 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[216302]: [NOTICE]   (216340) : New worker (216345) forked
Dec  5 08:02:25 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[216302]: [NOTICE]   (216340) : Loading success.
Dec  5 08:02:25 np0005546954 podman[216301]: 2025-12-05 13:02:25.939062647 +0000 UTC m=+0.068214402 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 08:02:25 np0005546954 nova_compute[187160]: 2025-12-05 13:02:25.944 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:26 np0005546954 nova_compute[187160]: 2025-12-05 13:02:26.084 187164 DEBUG nova.network.neutron [req-99198045-c720-420e-9a99-f791703707e1 req-44b889e7-34d0-47a6-81f0-87e8bd7008e0 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Updated VIF entry in instance network info cache for port 6d0f8de4-2f58-4be9-973b-d7c01431f90c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 08:02:26 np0005546954 nova_compute[187160]: 2025-12-05 13:02:26.085 187164 DEBUG nova.network.neutron [req-99198045-c720-420e-9a99-f791703707e1 req-44b889e7-34d0-47a6-81f0-87e8bd7008e0 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Updating instance_info_cache with network_info: [{"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:02:26 np0005546954 nova_compute[187160]: 2025-12-05 13:02:26.230 187164 DEBUG oslo_concurrency.lockutils [req-99198045-c720-420e-9a99-f791703707e1 req-44b889e7-34d0-47a6-81f0-87e8bd7008e0 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-4fe7206a-4625-4e1b-ad62-a53794dfe8f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:02:27 np0005546954 nova_compute[187160]: 2025-12-05 13:02:27.518 187164 DEBUG nova.compute.manager [req-e6f5e0d8-a6cb-405d-a6e0-480bef9fb4ea req-cd198d25-8190-4302-bde8-a9e25eaaaeee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:02:27 np0005546954 nova_compute[187160]: 2025-12-05 13:02:27.518 187164 DEBUG oslo_concurrency.lockutils [req-e6f5e0d8-a6cb-405d-a6e0-480bef9fb4ea req-cd198d25-8190-4302-bde8-a9e25eaaaeee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:27 np0005546954 nova_compute[187160]: 2025-12-05 13:02:27.518 187164 DEBUG oslo_concurrency.lockutils [req-e6f5e0d8-a6cb-405d-a6e0-480bef9fb4ea req-cd198d25-8190-4302-bde8-a9e25eaaaeee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:27 np0005546954 nova_compute[187160]: 2025-12-05 13:02:27.519 187164 DEBUG oslo_concurrency.lockutils [req-e6f5e0d8-a6cb-405d-a6e0-480bef9fb4ea req-cd198d25-8190-4302-bde8-a9e25eaaaeee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:27 np0005546954 nova_compute[187160]: 2025-12-05 13:02:27.519 187164 DEBUG nova.compute.manager [req-e6f5e0d8-a6cb-405d-a6e0-480bef9fb4ea req-cd198d25-8190-4302-bde8-a9e25eaaaeee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] No waiting events found dispatching network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:02:27 np0005546954 nova_compute[187160]: 2025-12-05 13:02:27.519 187164 WARNING nova.compute.manager [req-e6f5e0d8-a6cb-405d-a6e0-480bef9fb4ea req-cd198d25-8190-4302-bde8-a9e25eaaaeee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received unexpected event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c for instance with vm_state active and task_state None.#033[00m
Dec  5 08:02:29 np0005546954 nova_compute[187160]: 2025-12-05 13:02:29.476 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:30 np0005546954 nova_compute[187160]: 2025-12-05 13:02:30.943 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:32 np0005546954 nova_compute[187160]: 2025-12-05 13:02:32.554 187164 DEBUG nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Check if temp file /var/lib/nova/instances/tmp41bx79s6 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Dec  5 08:02:32 np0005546954 nova_compute[187160]: 2025-12-05 13:02:32.555 187164 DEBUG nova.compute.manager [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp41bx79s6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4fe7206a-4625-4e1b-ad62-a53794dfe8f7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Dec  5 08:02:33 np0005546954 nova_compute[187160]: 2025-12-05 13:02:33.600 187164 DEBUG oslo_concurrency.processutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:02:33 np0005546954 nova_compute[187160]: 2025-12-05 13:02:33.658 187164 DEBUG oslo_concurrency.processutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:02:33 np0005546954 nova_compute[187160]: 2025-12-05 13:02:33.660 187164 DEBUG oslo_concurrency.processutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:02:33 np0005546954 nova_compute[187160]: 2025-12-05 13:02:33.753 187164 DEBUG oslo_concurrency.processutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:02:34 np0005546954 nova_compute[187160]: 2025-12-05 13:02:34.479 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:35 np0005546954 podman[197513]: time="2025-12-05T13:02:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:02:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:02:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:02:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:02:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3062 "" "Go-http-client/1.1"
Dec  5 08:02:35 np0005546954 nova_compute[187160]: 2025-12-05 13:02:35.981 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:36 np0005546954 systemd-logind[789]: New session 34 of user nova.
Dec  5 08:02:36 np0005546954 systemd[1]: Created slice User Slice of UID 42436.
Dec  5 08:02:36 np0005546954 systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec  5 08:02:36 np0005546954 systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec  5 08:02:36 np0005546954 systemd[1]: Starting User Manager for UID 42436...
Dec  5 08:02:36 np0005546954 systemd[216379]: Queued start job for default target Main User Target.
Dec  5 08:02:36 np0005546954 systemd[216379]: Created slice User Application Slice.
Dec  5 08:02:36 np0005546954 systemd[216379]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  5 08:02:36 np0005546954 systemd[216379]: Started Daily Cleanup of User's Temporary Directories.
Dec  5 08:02:36 np0005546954 systemd[216379]: Reached target Paths.
Dec  5 08:02:36 np0005546954 systemd[216379]: Reached target Timers.
Dec  5 08:02:36 np0005546954 systemd[216379]: Starting D-Bus User Message Bus Socket...
Dec  5 08:02:36 np0005546954 systemd[216379]: Starting Create User's Volatile Files and Directories...
Dec  5 08:02:36 np0005546954 systemd[216379]: Finished Create User's Volatile Files and Directories.
Dec  5 08:02:36 np0005546954 systemd[216379]: Listening on D-Bus User Message Bus Socket.
Dec  5 08:02:36 np0005546954 systemd[216379]: Reached target Sockets.
Dec  5 08:02:36 np0005546954 systemd[216379]: Reached target Basic System.
Dec  5 08:02:36 np0005546954 systemd[216379]: Reached target Main User Target.
Dec  5 08:02:36 np0005546954 systemd[216379]: Startup finished in 131ms.
Dec  5 08:02:36 np0005546954 systemd[1]: Started User Manager for UID 42436.
Dec  5 08:02:36 np0005546954 systemd[1]: Started Session 34 of User nova.
Dec  5 08:02:36 np0005546954 systemd[1]: session-34.scope: Deactivated successfully.
Dec  5 08:02:36 np0005546954 systemd-logind[789]: Session 34 logged out. Waiting for processes to exit.
Dec  5 08:02:36 np0005546954 systemd-logind[789]: Removed session 34.
Dec  5 08:02:37 np0005546954 nova_compute[187160]: 2025-12-05 13:02:37.463 187164 DEBUG nova.compute.manager [req-07a4d803-878a-454c-b5b9-b18acc841f48 req-0e41c0ff-95f8-4ae8-9a3c-9f27fc5368ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-unplugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:02:37 np0005546954 nova_compute[187160]: 2025-12-05 13:02:37.466 187164 DEBUG oslo_concurrency.lockutils [req-07a4d803-878a-454c-b5b9-b18acc841f48 req-0e41c0ff-95f8-4ae8-9a3c-9f27fc5368ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:37 np0005546954 nova_compute[187160]: 2025-12-05 13:02:37.466 187164 DEBUG oslo_concurrency.lockutils [req-07a4d803-878a-454c-b5b9-b18acc841f48 req-0e41c0ff-95f8-4ae8-9a3c-9f27fc5368ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:37 np0005546954 nova_compute[187160]: 2025-12-05 13:02:37.467 187164 DEBUG oslo_concurrency.lockutils [req-07a4d803-878a-454c-b5b9-b18acc841f48 req-0e41c0ff-95f8-4ae8-9a3c-9f27fc5368ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:37 np0005546954 nova_compute[187160]: 2025-12-05 13:02:37.467 187164 DEBUG nova.compute.manager [req-07a4d803-878a-454c-b5b9-b18acc841f48 req-0e41c0ff-95f8-4ae8-9a3c-9f27fc5368ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] No waiting events found dispatching network-vif-unplugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:02:37 np0005546954 nova_compute[187160]: 2025-12-05 13:02:37.468 187164 DEBUG nova.compute.manager [req-07a4d803-878a-454c-b5b9-b18acc841f48 req-0e41c0ff-95f8-4ae8-9a3c-9f27fc5368ee 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-unplugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 08:02:38 np0005546954 ovn_controller[95566]: 2025-12-05T13:02:38Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3e:db:2d 10.100.0.12
Dec  5 08:02:38 np0005546954 ovn_controller[95566]: 2025-12-05T13:02:38Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3e:db:2d 10.100.0.12
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.103 187164 INFO nova.compute.manager [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Took 5.35 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.104 187164 DEBUG nova.compute.manager [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.130 187164 DEBUG nova.compute.manager [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp41bx79s6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4fe7206a-4625-4e1b-ad62-a53794dfe8f7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(d14e2aac-c861-4e4a-bda0-5453927b7b70),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.154 187164 DEBUG nova.objects.instance [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lazy-loading 'migration_context' on Instance uuid 4fe7206a-4625-4e1b-ad62-a53794dfe8f7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.156 187164 DEBUG nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.158 187164 DEBUG nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.158 187164 DEBUG nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.178 187164 DEBUG nova.virt.libvirt.vif [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T13:02:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-938314815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-938314815',id=22,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:02:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-bftthp2a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T13:02:25Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=4fe7206a-4625-4e1b-ad62-a53794dfe8f7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.178 187164 DEBUG nova.network.os_vif_util [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Converting VIF {"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.180 187164 DEBUG nova.network.os_vif_util [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:db:2d,bridge_name='br-int',has_traffic_filtering=True,id=6d0f8de4-2f58-4be9-973b-d7c01431f90c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0f8de4-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.181 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Updating guest XML with vif config: <interface type="ethernet">
Dec  5 08:02:39 np0005546954 nova_compute[187160]:  <mac address="fa:16:3e:3e:db:2d"/>
Dec  5 08:02:39 np0005546954 nova_compute[187160]:  <model type="virtio"/>
Dec  5 08:02:39 np0005546954 nova_compute[187160]:  <driver name="vhost" rx_queue_size="512"/>
Dec  5 08:02:39 np0005546954 nova_compute[187160]:  <mtu size="1442"/>
Dec  5 08:02:39 np0005546954 nova_compute[187160]:  <target dev="tap6d0f8de4-2f"/>
Dec  5 08:02:39 np0005546954 nova_compute[187160]: </interface>
Dec  5 08:02:39 np0005546954 nova_compute[187160]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.182 187164 DEBUG nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.483 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.553 187164 DEBUG nova.compute.manager [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.555 187164 DEBUG oslo_concurrency.lockutils [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.555 187164 DEBUG oslo_concurrency.lockutils [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.555 187164 DEBUG oslo_concurrency.lockutils [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.556 187164 DEBUG nova.compute.manager [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] No waiting events found dispatching network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.556 187164 WARNING nova.compute.manager [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received unexpected event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c for instance with vm_state active and task_state migrating.#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.556 187164 DEBUG nova.compute.manager [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-changed-6d0f8de4-2f58-4be9-973b-d7c01431f90c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.556 187164 DEBUG nova.compute.manager [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Refreshing instance network info cache due to event network-changed-6d0f8de4-2f58-4be9-973b-d7c01431f90c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.556 187164 DEBUG oslo_concurrency.lockutils [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-4fe7206a-4625-4e1b-ad62-a53794dfe8f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.557 187164 DEBUG oslo_concurrency.lockutils [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-4fe7206a-4625-4e1b-ad62-a53794dfe8f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.557 187164 DEBUG nova.network.neutron [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Refreshing network info cache for port 6d0f8de4-2f58-4be9-973b-d7c01431f90c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 08:02:39 np0005546954 podman[216397]: 2025-12-05 13:02:39.588440446 +0000 UTC m=+0.088191840 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.661 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.662 187164 INFO nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Increasing downtime to 50 ms after 0 sec elapsed time
Dec  5 08:02:39 np0005546954 nova_compute[187160]: 2025-12-05 13:02:39.734 187164 INFO nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec  5 08:02:40 np0005546954 nova_compute[187160]: 2025-12-05 13:02:40.238 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec  5 08:02:40 np0005546954 nova_compute[187160]: 2025-12-05 13:02:40.239 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec  5 08:02:40 np0005546954 nova_compute[187160]: 2025-12-05 13:02:40.590 187164 DEBUG nova.network.neutron [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Updated VIF entry in instance network info cache for port 6d0f8de4-2f58-4be9-973b-d7c01431f90c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  5 08:02:40 np0005546954 nova_compute[187160]: 2025-12-05 13:02:40.591 187164 DEBUG nova.network.neutron [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Updating instance_info_cache with network_info: [{"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:02:40 np0005546954 nova_compute[187160]: 2025-12-05 13:02:40.743 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec  5 08:02:40 np0005546954 nova_compute[187160]: 2025-12-05 13:02:40.744 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec  5 08:02:40 np0005546954 nova_compute[187160]: 2025-12-05 13:02:40.982 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:02:41 np0005546954 nova_compute[187160]: 2025-12-05 13:02:41.247 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec  5 08:02:41 np0005546954 nova_compute[187160]: 2025-12-05 13:02:41.247 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec  5 08:02:41 np0005546954 nova_compute[187160]: 2025-12-05 13:02:41.752 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec  5 08:02:41 np0005546954 nova_compute[187160]: 2025-12-05 13:02:41.752 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.030 187164 DEBUG oslo_concurrency.lockutils [req-2f676dd4-6eaf-4561-8374-7bea5cfb1496 req-154c3cca-d839-47fd-86d4-ee00d8b4a7cf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-4fe7206a-4625-4e1b-ad62-a53794dfe8f7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.256 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.256 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.611 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939762.6112766, 4fe7206a-4625-4e1b-ad62-a53794dfe8f7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.612 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] VM Paused (Lifecycle Event)
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.761 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.761 187164 DEBUG nova.virt.libvirt.migration [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.768 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.773 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 08:02:42 np0005546954 kernel: tap6d0f8de4-2f (unregistering): left promiscuous mode
Dec  5 08:02:42 np0005546954 NetworkManager[55665]: <info>  [1764939762.8120] device (tap6d0f8de4-2f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 08:02:42 np0005546954 ovn_controller[95566]: 2025-12-05T13:02:42Z|00211|binding|INFO|Releasing lport 6d0f8de4-2f58-4be9-973b-d7c01431f90c from this chassis (sb_readonly=0)
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.862 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] During sync_power_state the instance has a pending task (migrating). Skip.
Dec  5 08:02:42 np0005546954 ovn_controller[95566]: 2025-12-05T13:02:42Z|00212|binding|INFO|Setting lport 6d0f8de4-2f58-4be9-973b-d7c01431f90c down in Southbound
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.862 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:02:42 np0005546954 ovn_controller[95566]: 2025-12-05T13:02:42Z|00213|binding|INFO|Removing iface tap6d0f8de4-2f ovn-installed in OVS
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.865 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:02:42 np0005546954 nova_compute[187160]: 2025-12-05 13:02:42.889 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:02:42 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:42.935 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:db:2d 10.100.0.12'], port_security=['fa:16:3e:3e:db:2d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b049cde7-59fd-4961-9791-d49d79184b2c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4fe7206a-4625-4e1b-ad62-a53794dfe8f7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '8', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=6d0f8de4-2f58-4be9-973b-d7c01431f90c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:02:42 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:42.938 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 6d0f8de4-2f58-4be9-973b-d7c01431f90c in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis
Dec  5 08:02:42 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:42.940 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d4389bc8-2898-48b0-9741-5183b54fe83c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec  5 08:02:42 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:42.942 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[08c7bf5b-3a88-4a92-9dd2-d17a8b41ece1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 08:02:42 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:42.943 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace which is not needed anymore
Dec  5 08:02:42 np0005546954 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000016.scope: Deactivated successfully.
Dec  5 08:02:42 np0005546954 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000016.scope: Consumed 12.572s CPU time.
Dec  5 08:02:42 np0005546954 systemd-machined[153497]: Machine qemu-20-instance-00000016 terminated.
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.047 187164 DEBUG nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.048 187164 DEBUG nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.048 187164 DEBUG nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Dec  5 08:02:43 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[216302]: [NOTICE]   (216340) : haproxy version is 2.8.14-c23fe91
Dec  5 08:02:43 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[216302]: [NOTICE]   (216340) : path to executable is /usr/sbin/haproxy
Dec  5 08:02:43 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[216302]: [WARNING]  (216340) : Exiting Master process...
Dec  5 08:02:43 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[216302]: [WARNING]  (216340) : Exiting Master process...
Dec  5 08:02:43 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[216302]: [ALERT]    (216340) : Current worker (216345) exited with code 143 (Terminated)
Dec  5 08:02:43 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[216302]: [WARNING]  (216340) : All workers exited. Exiting... (0)
Dec  5 08:02:43 np0005546954 systemd[1]: libpod-e48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540.scope: Deactivated successfully.
Dec  5 08:02:43 np0005546954 podman[216465]: 2025-12-05 13:02:43.134074355 +0000 UTC m=+0.060044480 container died e48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 08:02:43 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540-userdata-shm.mount: Deactivated successfully.
Dec  5 08:02:43 np0005546954 systemd[1]: var-lib-containers-storage-overlay-995972418d9cb052317ad27c7d93f306a595130b862e6f2c0002c6ed0de9077f-merged.mount: Deactivated successfully.
Dec  5 08:02:43 np0005546954 podman[216465]: 2025-12-05 13:02:43.186774316 +0000 UTC m=+0.112744411 container cleanup e48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:02:43 np0005546954 systemd[1]: libpod-conmon-e48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540.scope: Deactivated successfully.
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.266 187164 DEBUG nova.virt.libvirt.guest [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '4fe7206a-4625-4e1b-ad62-a53794dfe8f7' (instance-00000016) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.268 187164 INFO nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Migration operation has completed
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.268 187164 INFO nova.compute.manager [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] _post_live_migration() is started..
Dec  5 08:02:43 np0005546954 podman[216496]: 2025-12-05 13:02:43.280783445 +0000 UTC m=+0.061485184 container remove e48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:02:43 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:43.290 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0ff652-df49-4bca-a0d7-325b08e639b1]: (4, ('Fri Dec  5 01:02:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (e48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540)\ne48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540\nFri Dec  5 01:02:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (e48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540)\ne48113b657ab75420ee1a65ceaa049e212a5bf1de10b8b29a7ca63af9a56e540\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 08:02:43 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:43.292 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[1195057e-81ce-46bb-82f4-9ab078e91c2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 08:02:43 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:43.293 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.295 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:02:43 np0005546954 kernel: tapd4389bc8-20: left promiscuous mode
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.326 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.327 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:02:43 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:43.329 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5f90df-666f-46d5-b30f-fa9c0e89ceb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 08:02:43 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:43.350 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[207aed44-d5fc-4252-867c-8121b5880812]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 08:02:43 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:43.352 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[8a7dfefa-0627-43f0-928b-754cc4b45c3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.360 187164 DEBUG nova.compute.manager [req-07d31069-a0fe-446d-ae44-0c59f2e1e1d2 req-56702aba-e3ca-456b-b341-011d6d40ceec 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-unplugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.361 187164 DEBUG oslo_concurrency.lockutils [req-07d31069-a0fe-446d-ae44-0c59f2e1e1d2 req-56702aba-e3ca-456b-b341-011d6d40ceec 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.361 187164 DEBUG oslo_concurrency.lockutils [req-07d31069-a0fe-446d-ae44-0c59f2e1e1d2 req-56702aba-e3ca-456b-b341-011d6d40ceec 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.361 187164 DEBUG oslo_concurrency.lockutils [req-07d31069-a0fe-446d-ae44-0c59f2e1e1d2 req-56702aba-e3ca-456b-b341-011d6d40ceec 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.362 187164 DEBUG nova.compute.manager [req-07d31069-a0fe-446d-ae44-0c59f2e1e1d2 req-56702aba-e3ca-456b-b341-011d6d40ceec 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] No waiting events found dispatching network-vif-unplugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 08:02:43 np0005546954 nova_compute[187160]: 2025-12-05 13:02:43.362 187164 DEBUG nova.compute.manager [req-07d31069-a0fe-446d-ae44-0c59f2e1e1d2 req-56702aba-e3ca-456b-b341-011d6d40ceec 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-unplugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec  5 08:02:43 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:43.369 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1ba25a-6408-4fa4-82ae-37eaaee14a26]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475189, 'reachable_time': 15320, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216514, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:02:43 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:43.373 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec  5 08:02:43 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:02:43.373 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[48a6794f-a94a-41be-ab2b-80ccb2414897]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 08:02:43 np0005546954 systemd[1]: run-netns-ovnmeta\x2dd4389bc8\x2d2898\x2d48b0\x2d9741\x2d5183b54fe83c.mount: Deactivated successfully.
Dec  5 08:02:44 np0005546954 nova_compute[187160]: 2025-12-05 13:02:44.486 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:44 np0005546954 nova_compute[187160]: 2025-12-05 13:02:44.512 187164 DEBUG nova.compute.manager [req-1270b314-525c-4bd1-92a4-b69b28a8879c req-b1780b08-a5e9-4eec-bd32-dbae51a4c90f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-unplugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:02:44 np0005546954 nova_compute[187160]: 2025-12-05 13:02:44.512 187164 DEBUG oslo_concurrency.lockutils [req-1270b314-525c-4bd1-92a4-b69b28a8879c req-b1780b08-a5e9-4eec-bd32-dbae51a4c90f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:44 np0005546954 nova_compute[187160]: 2025-12-05 13:02:44.513 187164 DEBUG oslo_concurrency.lockutils [req-1270b314-525c-4bd1-92a4-b69b28a8879c req-b1780b08-a5e9-4eec-bd32-dbae51a4c90f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:44 np0005546954 nova_compute[187160]: 2025-12-05 13:02:44.513 187164 DEBUG oslo_concurrency.lockutils [req-1270b314-525c-4bd1-92a4-b69b28a8879c req-b1780b08-a5e9-4eec-bd32-dbae51a4c90f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:44 np0005546954 nova_compute[187160]: 2025-12-05 13:02:44.513 187164 DEBUG nova.compute.manager [req-1270b314-525c-4bd1-92a4-b69b28a8879c req-b1780b08-a5e9-4eec-bd32-dbae51a4c90f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] No waiting events found dispatching network-vif-unplugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:02:44 np0005546954 nova_compute[187160]: 2025-12-05 13:02:44.514 187164 DEBUG nova.compute.manager [req-1270b314-525c-4bd1-92a4-b69b28a8879c req-b1780b08-a5e9-4eec-bd32-dbae51a4c90f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-unplugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 08:02:44 np0005546954 podman[216516]: 2025-12-05 13:02:44.586561429 +0000 UTC m=+0.077499101 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:02:44 np0005546954 podman[216515]: 2025-12-05 13:02:44.617234788 +0000 UTC m=+0.118200030 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125)
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.078 187164 DEBUG nova.network.neutron [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Activated binding for port 6d0f8de4-2f58-4be9-973b-d7c01431f90c and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.078 187164 DEBUG nova.compute.manager [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.080 187164 DEBUG nova.virt.libvirt.vif [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T13:02:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-938314815',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-938314815',id=22,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:02:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-bftthp2a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T13:02:29Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=4fe7206a-4625-4e1b-ad62-a53794dfe8f7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.080 187164 DEBUG nova.network.os_vif_util [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Converting VIF {"id": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "address": "fa:16:3e:3e:db:2d", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d0f8de4-2f", "ovs_interfaceid": "6d0f8de4-2f58-4be9-973b-d7c01431f90c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.081 187164 DEBUG nova.network.os_vif_util [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:db:2d,bridge_name='br-int',has_traffic_filtering=True,id=6d0f8de4-2f58-4be9-973b-d7c01431f90c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0f8de4-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.082 187164 DEBUG os_vif [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:db:2d,bridge_name='br-int',has_traffic_filtering=True,id=6d0f8de4-2f58-4be9-973b-d7c01431f90c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0f8de4-2f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.085 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.086 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d0f8de4-2f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.088 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.090 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.094 187164 INFO os_vif [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:db:2d,bridge_name='br-int',has_traffic_filtering=True,id=6d0f8de4-2f58-4be9-973b-d7c01431f90c,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d0f8de4-2f')#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.094 187164 DEBUG oslo_concurrency.lockutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.095 187164 DEBUG oslo_concurrency.lockutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.095 187164 DEBUG oslo_concurrency.lockutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.095 187164 DEBUG nova.compute.manager [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.095 187164 INFO nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Deleting instance files /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7_del#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.096 187164 INFO nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Deletion of /var/lib/nova/instances/4fe7206a-4625-4e1b-ad62-a53794dfe8f7_del complete#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.536 187164 DEBUG nova.compute.manager [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.537 187164 DEBUG oslo_concurrency.lockutils [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.538 187164 DEBUG oslo_concurrency.lockutils [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.538 187164 DEBUG oslo_concurrency.lockutils [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.539 187164 DEBUG nova.compute.manager [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] No waiting events found dispatching network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.539 187164 WARNING nova.compute.manager [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received unexpected event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c for instance with vm_state active and task_state migrating.#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.540 187164 DEBUG nova.compute.manager [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.540 187164 DEBUG oslo_concurrency.lockutils [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.541 187164 DEBUG oslo_concurrency.lockutils [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.542 187164 DEBUG oslo_concurrency.lockutils [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.542 187164 DEBUG nova.compute.manager [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] No waiting events found dispatching network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.543 187164 WARNING nova.compute.manager [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received unexpected event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c for instance with vm_state active and task_state migrating.#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.543 187164 DEBUG nova.compute.manager [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.544 187164 DEBUG oslo_concurrency.lockutils [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.544 187164 DEBUG oslo_concurrency.lockutils [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.545 187164 DEBUG oslo_concurrency.lockutils [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.545 187164 DEBUG nova.compute.manager [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] No waiting events found dispatching network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.546 187164 WARNING nova.compute.manager [req-13ef5593-ad57-47d7-bd94-53993c32054c req-b07782dc-9e69-4036-9efc-8ca4a229abf8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received unexpected event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c for instance with vm_state active and task_state migrating.#033[00m
Dec  5 08:02:45 np0005546954 nova_compute[187160]: 2025-12-05 13:02:45.985 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:46 np0005546954 systemd[1]: Stopping User Manager for UID 42436...
Dec  5 08:02:46 np0005546954 systemd[216379]: Activating special unit Exit the Session...
Dec  5 08:02:46 np0005546954 systemd[216379]: Stopped target Main User Target.
Dec  5 08:02:46 np0005546954 systemd[216379]: Stopped target Basic System.
Dec  5 08:02:46 np0005546954 systemd[216379]: Stopped target Paths.
Dec  5 08:02:46 np0005546954 systemd[216379]: Stopped target Sockets.
Dec  5 08:02:46 np0005546954 systemd[216379]: Stopped target Timers.
Dec  5 08:02:46 np0005546954 systemd[216379]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec  5 08:02:46 np0005546954 systemd[216379]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  5 08:02:46 np0005546954 systemd[216379]: Closed D-Bus User Message Bus Socket.
Dec  5 08:02:46 np0005546954 systemd[216379]: Stopped Create User's Volatile Files and Directories.
Dec  5 08:02:46 np0005546954 systemd[216379]: Removed slice User Application Slice.
Dec  5 08:02:46 np0005546954 systemd[216379]: Reached target Shutdown.
Dec  5 08:02:46 np0005546954 systemd[216379]: Finished Exit the Session.
Dec  5 08:02:46 np0005546954 systemd[216379]: Reached target Exit the Session.
Dec  5 08:02:46 np0005546954 systemd[1]: user@42436.service: Deactivated successfully.
Dec  5 08:02:46 np0005546954 systemd[1]: Stopped User Manager for UID 42436.
Dec  5 08:02:46 np0005546954 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec  5 08:02:47 np0005546954 systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec  5 08:02:47 np0005546954 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec  5 08:02:47 np0005546954 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec  5 08:02:47 np0005546954 systemd[1]: Removed slice User Slice of UID 42436.
Dec  5 08:02:47 np0005546954 nova_compute[187160]: 2025-12-05 13:02:47.819 187164 DEBUG nova.compute.manager [req-20e24e1b-9cc3-4027-8796-cca80bb20ee2 req-ca49329a-5a89-433d-889b-823ad2b6a731 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:02:47 np0005546954 nova_compute[187160]: 2025-12-05 13:02:47.821 187164 DEBUG oslo_concurrency.lockutils [req-20e24e1b-9cc3-4027-8796-cca80bb20ee2 req-ca49329a-5a89-433d-889b-823ad2b6a731 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:47 np0005546954 nova_compute[187160]: 2025-12-05 13:02:47.822 187164 DEBUG oslo_concurrency.lockutils [req-20e24e1b-9cc3-4027-8796-cca80bb20ee2 req-ca49329a-5a89-433d-889b-823ad2b6a731 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:47 np0005546954 nova_compute[187160]: 2025-12-05 13:02:47.822 187164 DEBUG oslo_concurrency.lockutils [req-20e24e1b-9cc3-4027-8796-cca80bb20ee2 req-ca49329a-5a89-433d-889b-823ad2b6a731 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:47 np0005546954 nova_compute[187160]: 2025-12-05 13:02:47.822 187164 DEBUG nova.compute.manager [req-20e24e1b-9cc3-4027-8796-cca80bb20ee2 req-ca49329a-5a89-433d-889b-823ad2b6a731 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] No waiting events found dispatching network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:02:47 np0005546954 nova_compute[187160]: 2025-12-05 13:02:47.823 187164 WARNING nova.compute.manager [req-20e24e1b-9cc3-4027-8796-cca80bb20ee2 req-ca49329a-5a89-433d-889b-823ad2b6a731 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Received unexpected event network-vif-plugged-6d0f8de4-2f58-4be9-973b-d7c01431f90c for instance with vm_state active and task_state migrating.#033[00m
Dec  5 08:02:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:02:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:02:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:02:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:02:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:02:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:02:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:02:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:02:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:02:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:02:50 np0005546954 nova_compute[187160]: 2025-12-05 13:02:50.089 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:50 np0005546954 nova_compute[187160]: 2025-12-05 13:02:50.988 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.025 187164 DEBUG oslo_concurrency.lockutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Acquiring lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.026 187164 DEBUG oslo_concurrency.lockutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.026 187164 DEBUG oslo_concurrency.lockutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "4fe7206a-4625-4e1b-ad62-a53794dfe8f7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.050 187164 DEBUG oslo_concurrency.lockutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.051 187164 DEBUG oslo_concurrency.lockutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.051 187164 DEBUG oslo_concurrency.lockutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.052 187164 DEBUG nova.compute.resource_tracker [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.241 187164 WARNING nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.242 187164 DEBUG nova.compute.resource_tracker [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5837MB free_disk=73.33565902709961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": 
"0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.242 187164 DEBUG oslo_concurrency.lockutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.243 187164 DEBUG oslo_concurrency.lockutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.288 187164 DEBUG nova.compute.resource_tracker [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Migration for instance 4fe7206a-4625-4e1b-ad62-a53794dfe8f7 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.309 187164 DEBUG nova.compute.resource_tracker [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.340 187164 DEBUG nova.compute.resource_tracker [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Migration d14e2aac-c861-4e4a-bda0-5453927b7b70 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.341 187164 DEBUG nova.compute.resource_tracker [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.341 187164 DEBUG nova.compute.resource_tracker [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.392 187164 DEBUG nova.compute.provider_tree [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.412 187164 DEBUG nova.scheduler.client.report [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.435 187164 DEBUG nova.compute.resource_tracker [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.436 187164 DEBUG oslo_concurrency.lockutils [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.441 187164 INFO nova.compute.manager [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.567 187164 INFO nova.scheduler.client.report [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Deleted allocation for migration d14e2aac-c861-4e4a-bda0-5453927b7b70#033[00m
Dec  5 08:02:51 np0005546954 nova_compute[187160]: 2025-12-05 13:02:51.567 187164 DEBUG nova.virt.libvirt.driver [None req-f6db7f29-d4dc-47bc-a4a0-357a466a1216 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Dec  5 08:02:55 np0005546954 nova_compute[187160]: 2025-12-05 13:02:55.093 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:55 np0005546954 ovn_controller[95566]: 2025-12-05T13:02:55Z|00214|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Dec  5 08:02:55 np0005546954 nova_compute[187160]: 2025-12-05 13:02:55.992 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:02:56 np0005546954 podman[216571]: 2025-12-05 13:02:56.577651965 +0000 UTC m=+0.080415331 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 08:02:56 np0005546954 podman[216570]: 2025-12-05 13:02:56.585059224 +0000 UTC m=+0.083019010 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_id=edpm, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Dec  5 08:02:58 np0005546954 nova_compute[187160]: 2025-12-05 13:02:58.046 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939763.0443509, 4fe7206a-4625-4e1b-ad62-a53794dfe8f7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:02:58 np0005546954 nova_compute[187160]: 2025-12-05 13:02:58.046 187164 INFO nova.compute.manager [-] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] VM Stopped (Lifecycle Event)#033[00m
Dec  5 08:02:58 np0005546954 nova_compute[187160]: 2025-12-05 13:02:58.070 187164 DEBUG nova.compute.manager [None req-48ad9589-9263-4158-817f-ad4a88cd4faf - - - - - -] [instance: 4fe7206a-4625-4e1b-ad62-a53794dfe8f7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:03:00 np0005546954 nova_compute[187160]: 2025-12-05 13:03:00.098 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:00 np0005546954 nova_compute[187160]: 2025-12-05 13:03:00.994 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:05 np0005546954 nova_compute[187160]: 2025-12-05 13:03:05.101 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:05 np0005546954 podman[197513]: time="2025-12-05T13:03:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:03:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:03:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:03:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:03:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Dec  5 08:03:05 np0005546954 nova_compute[187160]: 2025-12-05 13:03:05.995 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:08 np0005546954 nova_compute[187160]: 2025-12-05 13:03:08.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:03:08 np0005546954 nova_compute[187160]: 2025-12-05 13:03:08.041 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:03:08 np0005546954 nova_compute[187160]: 2025-12-05 13:03:08.041 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:03:09 np0005546954 nova_compute[187160]: 2025-12-05 13:03:09.345 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:03:09 np0005546954 nova_compute[187160]: 2025-12-05 13:03:09.345 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:03:10 np0005546954 nova_compute[187160]: 2025-12-05 13:03:10.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:03:10 np0005546954 nova_compute[187160]: 2025-12-05 13:03:10.103 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:10 np0005546954 podman[216613]: 2025-12-05 13:03:10.593398614 +0000 UTC m=+0.091705039 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  5 08:03:10 np0005546954 nova_compute[187160]: 2025-12-05 13:03:10.997 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:14 np0005546954 nova_compute[187160]: 2025-12-05 13:03:14.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:03:14 np0005546954 nova_compute[187160]: 2025-12-05 13:03:14.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:03:14 np0005546954 nova_compute[187160]: 2025-12-05 13:03:14.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.082 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.083 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.083 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.083 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.106 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.223 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.224 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5874MB free_disk=73.33552169799805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.225 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.225 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.276 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.277 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.298 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.310 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.311 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:03:15 np0005546954 nova_compute[187160]: 2025-12-05 13:03:15.312 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:03:15 np0005546954 podman[216633]: 2025-12-05 13:03:15.567031457 +0000 UTC m=+0.067615093 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 08:03:15 np0005546954 podman[216632]: 2025-12-05 13:03:15.587077617 +0000 UTC m=+0.102656298 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 08:03:16 np0005546954 nova_compute[187160]: 2025-12-05 13:03:16.053 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:03:16.966 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:03:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:03:16.966 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:03:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:03:16.966 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:03:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:03:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:03:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:03:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:03:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:03:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:03:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:03:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:03:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:03:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:03:20 np0005546954 nova_compute[187160]: 2025-12-05 13:03:20.107 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:20 np0005546954 nova_compute[187160]: 2025-12-05 13:03:20.305 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:03:20 np0005546954 nova_compute[187160]: 2025-12-05 13:03:20.750 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:03:20 np0005546954 nova_compute[187160]: 2025-12-05 13:03:20.750 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:03:21 np0005546954 nova_compute[187160]: 2025-12-05 13:03:21.055 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:21 np0005546954 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  5 08:03:22 np0005546954 nova_compute[187160]: 2025-12-05 13:03:22.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:03:25 np0005546954 nova_compute[187160]: 2025-12-05 13:03:25.110 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:26 np0005546954 nova_compute[187160]: 2025-12-05 13:03:26.058 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:27 np0005546954 podman[216683]: 2025-12-05 13:03:27.544659646 +0000 UTC m=+0.059575254 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, config_id=edpm)
Dec  5 08:03:27 np0005546954 podman[216684]: 2025-12-05 13:03:27.60487663 +0000 UTC m=+0.107477998 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 08:03:29 np0005546954 ovn_controller[95566]: 2025-12-05T13:03:29Z|00215|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec  5 08:03:30 np0005546954 nova_compute[187160]: 2025-12-05 13:03:30.113 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:31 np0005546954 nova_compute[187160]: 2025-12-05 13:03:31.060 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:35 np0005546954 nova_compute[187160]: 2025-12-05 13:03:35.117 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:35 np0005546954 podman[197513]: time="2025-12-05T13:03:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:03:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:03:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:03:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:03:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Dec  5 08:03:36 np0005546954 nova_compute[187160]: 2025-12-05 13:03:36.062 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:40 np0005546954 nova_compute[187160]: 2025-12-05 13:03:40.120 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:41 np0005546954 nova_compute[187160]: 2025-12-05 13:03:41.064 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:41 np0005546954 podman[216724]: 2025-12-05 13:03:41.561154009 +0000 UTC m=+0.066801829 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 08:03:45 np0005546954 nova_compute[187160]: 2025-12-05 13:03:45.121 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:46 np0005546954 nova_compute[187160]: 2025-12-05 13:03:46.066 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:46 np0005546954 podman[216747]: 2025-12-05 13:03:46.588312469 +0000 UTC m=+0.077716086 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:03:46 np0005546954 podman[216746]: 2025-12-05 13:03:46.59770227 +0000 UTC m=+0.101485142 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 08:03:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:03:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:03:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:03:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:03:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:03:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:03:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:03:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:03:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:03:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:03:50 np0005546954 nova_compute[187160]: 2025-12-05 13:03:50.125 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:51 np0005546954 nova_compute[187160]: 2025-12-05 13:03:51.068 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:55 np0005546954 nova_compute[187160]: 2025-12-05 13:03:55.128 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:56 np0005546954 nova_compute[187160]: 2025-12-05 13:03:56.071 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:03:58 np0005546954 podman[216794]: 2025-12-05 13:03:58.535952689 +0000 UTC m=+0.052397632 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 08:03:58 np0005546954 podman[216795]: 2025-12-05 13:03:58.540617684 +0000 UTC m=+0.052662301 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 08:04:00 np0005546954 nova_compute[187160]: 2025-12-05 13:04:00.129 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:01 np0005546954 nova_compute[187160]: 2025-12-05 13:04:01.072 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:05 np0005546954 nova_compute[187160]: 2025-12-05 13:04:05.132 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:05 np0005546954 podman[197513]: time="2025-12-05T13:04:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:04:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:04:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:04:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:04:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Dec  5 08:04:06 np0005546954 nova_compute[187160]: 2025-12-05 13:04:06.075 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:09 np0005546954 nova_compute[187160]: 2025-12-05 13:04:09.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:04:09 np0005546954 nova_compute[187160]: 2025-12-05 13:04:09.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:04:09 np0005546954 nova_compute[187160]: 2025-12-05 13:04:09.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:04:09 np0005546954 nova_compute[187160]: 2025-12-05 13:04:09.083 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:04:10 np0005546954 nova_compute[187160]: 2025-12-05 13:04:10.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:04:10 np0005546954 nova_compute[187160]: 2025-12-05 13:04:10.133 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:11 np0005546954 nova_compute[187160]: 2025-12-05 13:04:11.075 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:11 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:11.320 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:04:11 np0005546954 nova_compute[187160]: 2025-12-05 13:04:11.321 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:11 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:11.323 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 08:04:12 np0005546954 nova_compute[187160]: 2025-12-05 13:04:12.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:04:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:12.327 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:04:12 np0005546954 podman[216836]: 2025-12-05 13:04:12.598964462 +0000 UTC m=+0.105396283 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 08:04:15 np0005546954 nova_compute[187160]: 2025-12-05 13:04:15.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:04:15 np0005546954 nova_compute[187160]: 2025-12-05 13:04:15.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:04:15 np0005546954 nova_compute[187160]: 2025-12-05 13:04:15.136 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:16 np0005546954 nova_compute[187160]: 2025-12-05 13:04:16.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:04:16 np0005546954 nova_compute[187160]: 2025-12-05 13:04:16.077 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:16.967 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:04:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:16.968 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:04:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:16.968 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.072 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.072 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.073 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.073 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.288 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.290 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5876MB free_disk=73.32987594604492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.290 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.290 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.359 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.359 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.380 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing inventories for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.409 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating ProviderTree inventory for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.410 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.423 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing aggregate associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.448 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing trait associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.482 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.509 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.511 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:04:17 np0005546954 nova_compute[187160]: 2025-12-05 13:04:17.512 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:04:17 np0005546954 podman[216856]: 2025-12-05 13:04:17.586392663 +0000 UTC m=+0.082224396 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 08:04:17 np0005546954 podman[216855]: 2025-12-05 13:04:17.639552118 +0000 UTC m=+0.138760585 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller)
Dec  5 08:04:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:04:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:04:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:04:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:04:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:04:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:04:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:04:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:04:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:04:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:04:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:04:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:04:20 np0005546954 nova_compute[187160]: 2025-12-05 13:04:20.138 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:20 np0005546954 nova_compute[187160]: 2025-12-05 13:04:20.512 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:04:20 np0005546954 nova_compute[187160]: 2025-12-05 13:04:20.513 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:04:21 np0005546954 nova_compute[187160]: 2025-12-05 13:04:21.079 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:23 np0005546954 nova_compute[187160]: 2025-12-05 13:04:23.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:04:25 np0005546954 nova_compute[187160]: 2025-12-05 13:04:25.141 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:26 np0005546954 nova_compute[187160]: 2025-12-05 13:04:26.082 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:26 np0005546954 nova_compute[187160]: 2025-12-05 13:04:26.372 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "76812e6f-bda5-495b-be99-2ff8c5960729" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:04:26 np0005546954 nova_compute[187160]: 2025-12-05 13:04:26.372 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:04:26 np0005546954 nova_compute[187160]: 2025-12-05 13:04:26.535 187164 DEBUG nova.compute.manager [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 08:04:28 np0005546954 nova_compute[187160]: 2025-12-05 13:04:28.359 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:04:28 np0005546954 nova_compute[187160]: 2025-12-05 13:04:28.359 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:04:28 np0005546954 nova_compute[187160]: 2025-12-05 13:04:28.366 187164 DEBUG nova.virt.hardware [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 08:04:28 np0005546954 nova_compute[187160]: 2025-12-05 13:04:28.366 187164 INFO nova.compute.claims [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  5 08:04:29 np0005546954 podman[216903]: 2025-12-05 13:04:29.573053392 +0000 UTC m=+0.082716341 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec  5 08:04:29 np0005546954 podman[216904]: 2025-12-05 13:04:29.57653715 +0000 UTC m=+0.071166734 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:04:30 np0005546954 nova_compute[187160]: 2025-12-05 13:04:30.143 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:30 np0005546954 nova_compute[187160]: 2025-12-05 13:04:30.749 187164 DEBUG nova.compute.provider_tree [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:04:30 np0005546954 nova_compute[187160]: 2025-12-05 13:04:30.768 187164 DEBUG nova.scheduler.client.report [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:04:30 np0005546954 nova_compute[187160]: 2025-12-05 13:04:30.797 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:04:30 np0005546954 nova_compute[187160]: 2025-12-05 13:04:30.799 187164 DEBUG nova.compute.manager [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 08:04:30 np0005546954 nova_compute[187160]: 2025-12-05 13:04:30.872 187164 DEBUG nova.compute.manager [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 08:04:30 np0005546954 nova_compute[187160]: 2025-12-05 13:04:30.873 187164 DEBUG nova.network.neutron [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 08:04:30 np0005546954 nova_compute[187160]: 2025-12-05 13:04:30.904 187164 INFO nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.083 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.153 187164 DEBUG nova.policy [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ae0bb20ac8b4be99eb1abddc7310436', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.321 187164 DEBUG nova.compute.manager [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.637 187164 DEBUG nova.compute.manager [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.638 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.638 187164 INFO nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Creating image(s)#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.639 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "/var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.639 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.640 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "/var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.651 187164 DEBUG oslo_concurrency.processutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.738 187164 DEBUG oslo_concurrency.processutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.740 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.741 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.763 187164 DEBUG oslo_concurrency.processutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.837 187164 DEBUG oslo_concurrency.processutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.839 187164 DEBUG oslo_concurrency.processutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.875 187164 DEBUG oslo_concurrency.processutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.877 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.878 187164 DEBUG oslo_concurrency.processutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.932 187164 DEBUG oslo_concurrency.processutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.934 187164 DEBUG nova.virt.disk.api [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Checking if we can resize image /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.935 187164 DEBUG oslo_concurrency.processutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.996 187164 DEBUG oslo_concurrency.processutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.998 187164 DEBUG nova.virt.disk.api [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Cannot resize image /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 08:04:31 np0005546954 nova_compute[187160]: 2025-12-05 13:04:31.999 187164 DEBUG nova.objects.instance [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'migration_context' on Instance uuid 76812e6f-bda5-495b-be99-2ff8c5960729 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:04:32 np0005546954 nova_compute[187160]: 2025-12-05 13:04:32.024 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 08:04:32 np0005546954 nova_compute[187160]: 2025-12-05 13:04:32.025 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Ensure instance console log exists: /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 08:04:32 np0005546954 nova_compute[187160]: 2025-12-05 13:04:32.026 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:04:32 np0005546954 nova_compute[187160]: 2025-12-05 13:04:32.027 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:04:32 np0005546954 nova_compute[187160]: 2025-12-05 13:04:32.028 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:04:32 np0005546954 nova_compute[187160]: 2025-12-05 13:04:32.092 187164 DEBUG nova.network.neutron [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Successfully created port: c0abe331-dd3a-4aee-9aaa-7f89d13ef185 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 08:04:32 np0005546954 nova_compute[187160]: 2025-12-05 13:04:32.927 187164 DEBUG nova.network.neutron [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Successfully updated port: c0abe331-dd3a-4aee-9aaa-7f89d13ef185 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 08:04:32 np0005546954 nova_compute[187160]: 2025-12-05 13:04:32.945 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "refresh_cache-76812e6f-bda5-495b-be99-2ff8c5960729" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:04:32 np0005546954 nova_compute[187160]: 2025-12-05 13:04:32.946 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquired lock "refresh_cache-76812e6f-bda5-495b-be99-2ff8c5960729" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:04:32 np0005546954 nova_compute[187160]: 2025-12-05 13:04:32.946 187164 DEBUG nova.network.neutron [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 08:04:33 np0005546954 nova_compute[187160]: 2025-12-05 13:04:33.007 187164 DEBUG nova.compute.manager [req-47b1fcce-fc8c-42ae-ad41-5589d02a5f2d req-ef60b2cc-354e-447c-ba00-ab142b685fcf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Received event network-changed-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:04:33 np0005546954 nova_compute[187160]: 2025-12-05 13:04:33.007 187164 DEBUG nova.compute.manager [req-47b1fcce-fc8c-42ae-ad41-5589d02a5f2d req-ef60b2cc-354e-447c-ba00-ab142b685fcf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Refreshing instance network info cache due to event network-changed-c0abe331-dd3a-4aee-9aaa-7f89d13ef185. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 08:04:33 np0005546954 nova_compute[187160]: 2025-12-05 13:04:33.008 187164 DEBUG oslo_concurrency.lockutils [req-47b1fcce-fc8c-42ae-ad41-5589d02a5f2d req-ef60b2cc-354e-447c-ba00-ab142b685fcf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-76812e6f-bda5-495b-be99-2ff8c5960729" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:04:33 np0005546954 nova_compute[187160]: 2025-12-05 13:04:33.137 187164 DEBUG nova.network.neutron [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 08:04:33 np0005546954 nova_compute[187160]: 2025-12-05 13:04:33.941 187164 DEBUG nova.network.neutron [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Updating instance_info_cache with network_info: [{"id": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "address": "fa:16:3e:a2:39:ba", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0abe331-dd", "ovs_interfaceid": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.125 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Releasing lock "refresh_cache-76812e6f-bda5-495b-be99-2ff8c5960729" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.126 187164 DEBUG nova.compute.manager [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Instance network_info: |[{"id": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "address": "fa:16:3e:a2:39:ba", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0abe331-dd", "ovs_interfaceid": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.127 187164 DEBUG oslo_concurrency.lockutils [req-47b1fcce-fc8c-42ae-ad41-5589d02a5f2d req-ef60b2cc-354e-447c-ba00-ab142b685fcf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-76812e6f-bda5-495b-be99-2ff8c5960729" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.128 187164 DEBUG nova.network.neutron [req-47b1fcce-fc8c-42ae-ad41-5589d02a5f2d req-ef60b2cc-354e-447c-ba00-ab142b685fcf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Refreshing network info cache for port c0abe331-dd3a-4aee-9aaa-7f89d13ef185 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.134 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Start _get_guest_xml network_info=[{"id": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "address": "fa:16:3e:a2:39:ba", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0abe331-dd", "ovs_interfaceid": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.141 187164 WARNING nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.146 187164 DEBUG nova.virt.libvirt.host [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.147 187164 DEBUG nova.virt.libvirt.host [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.151 187164 DEBUG nova.virt.libvirt.host [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.151 187164 DEBUG nova.virt.libvirt.host [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.153 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.154 187164 DEBUG nova.virt.hardware [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.154 187164 DEBUG nova.virt.hardware [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.155 187164 DEBUG nova.virt.hardware [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.155 187164 DEBUG nova.virt.hardware [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.155 187164 DEBUG nova.virt.hardware [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.156 187164 DEBUG nova.virt.hardware [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.156 187164 DEBUG nova.virt.hardware [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.156 187164 DEBUG nova.virt.hardware [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.157 187164 DEBUG nova.virt.hardware [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.157 187164 DEBUG nova.virt.hardware [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.157 187164 DEBUG nova.virt.hardware [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
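The topology lines above ("Build topologies for 1 vcpu(s) 1:1:1" through "Sorted desired topologies") report Nova enumerating (sockets, cores, threads) combinations whose product equals the flavor's vCPU count, bounded by the 65536 limits logged earlier. A simplified sketch of that search (not `nova.virt.hardware`'s actual algorithm, which also applies preferences and sorting):

```python
from itertools import product

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals the
    vCPU count, within the given per-dimension limits.

    Illustrative sketch of the search the log reports; for 1 vCPU the only
    factorization is 1x1x1, which is why exactly one topology is found.
    """
    return [(s, c, t)
            for s, c, t in product(range(1, vcpus + 1), repeat=3)
            if s * c * t == vcpus
            and s <= max_sockets and c <= max_cores and t <= max_threads]
```

For this m1.nano flavor (`vcpus=1`) the enumeration yields the single topology the log then emits into the domain XML as `sockets="1" cores="1" threads="1"`.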
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.162 187164 DEBUG nova.virt.libvirt.vif [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T13:04:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-778787294',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-778787294',id=24,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-sbkc9t88',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=TagList,task_
state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:04:31Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=76812e6f-bda5-495b-be99-2ff8c5960729,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "address": "fa:16:3e:a2:39:ba", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0abe331-dd", "ovs_interfaceid": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.163 187164 DEBUG nova.network.os_vif_util [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "address": "fa:16:3e:a2:39:ba", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0abe331-dd", "ovs_interfaceid": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.164 187164 DEBUG nova.network.os_vif_util [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:39:ba,bridge_name='br-int',has_traffic_filtering=True,id=c0abe331-dd3a-4aee-9aaa-7f89d13ef185,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0abe331-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.165 187164 DEBUG nova.objects.instance [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'pci_devices' on Instance uuid 76812e6f-bda5-495b-be99-2ff8c5960729 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.186 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] End _get_guest_xml xml=<domain type="kvm">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  <uuid>76812e6f-bda5-495b-be99-2ff8c5960729</uuid>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  <name>instance-00000018</name>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteStrategies-server-778787294</nova:name>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 13:04:34</nova:creationTime>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:        <nova:user uuid="0ae0bb20ac8b4be99eb1abddc7310436">tempest-TestExecuteStrategies-192029678-project-member</nova:user>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:        <nova:project uuid="e6ae0d0dcde04b85b6dae45560cca988">tempest-TestExecuteStrategies-192029678</nova:project>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:        <nova:port uuid="c0abe331-dd3a-4aee-9aaa-7f89d13ef185">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <system>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <entry name="serial">76812e6f-bda5-495b-be99-2ff8c5960729</entry>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <entry name="uuid">76812e6f-bda5-495b-be99-2ff8c5960729</entry>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    </system>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  <os>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  </os>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  <features>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  </features>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  </clock>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  <devices>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    </disk>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk.config"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    </disk>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:a2:39:ba"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <target dev="tapc0abe331-dd"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    </interface>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/console.log" append="off"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    </serial>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <video>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    </video>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    </rng>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 08:04:34 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 08:04:34 np0005546954 nova_compute[187160]:  </devices>
Dec  5 08:04:34 np0005546954 nova_compute[187160]: </domain>
Dec  5 08:04:34 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
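When reading a `_get_guest_xml` dump like the one above offline, the interesting fields can be pulled out with the standard library. The snippet below parses a trimmed copy of the logged domain XML (only a few elements kept for illustration); note libvirt's `<memory>` is in KiB, so 131072 corresponds to the flavor's 128 MiB.

```python
import xml.etree.ElementTree as ET

# Trimmed excerpt of the domain XML dumped in the log above.
DOMAIN_XML = """\
<domain type="kvm">
  <uuid>76812e6f-bda5-495b-be99-2ff8c5960729</uuid>
  <name>instance-00000018</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
</domain>
"""

dom = ET.fromstring(DOMAIN_XML)
summary = {
    "name": dom.findtext("name"),
    "uuid": dom.findtext("uuid"),
    "memory_mib": int(dom.findtext("memory")) // 1024,  # libvirt stores KiB
    "vcpus": int(dom.findtext("vcpu")),
}
print(summary)
```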
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.188 187164 DEBUG nova.compute.manager [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Preparing to wait for external event network-vif-plugged-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.188 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.188 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.189 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.190 187164 DEBUG nova.virt.libvirt.vif [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T13:04:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-778787294',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-778787294',id=24,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-sbkc9t88',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:04:31Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=76812e6f-bda5-495b-be99-2ff8c5960729,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "address": "fa:16:3e:a2:39:ba", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0abe331-dd", "ovs_interfaceid": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.190 187164 DEBUG nova.network.os_vif_util [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "address": "fa:16:3e:a2:39:ba", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0abe331-dd", "ovs_interfaceid": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.191 187164 DEBUG nova.network.os_vif_util [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:39:ba,bridge_name='br-int',has_traffic_filtering=True,id=c0abe331-dd3a-4aee-9aaa-7f89d13ef185,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0abe331-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.191 187164 DEBUG os_vif [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:39:ba,bridge_name='br-int',has_traffic_filtering=True,id=c0abe331-dd3a-4aee-9aaa-7f89d13ef185,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0abe331-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.192 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.193 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.193 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.196 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.197 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc0abe331-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.197 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc0abe331-dd, col_values=(('external_ids', {'iface-id': 'c0abe331-dd3a-4aee-9aaa-7f89d13ef185', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:39:ba', 'vm-uuid': '76812e6f-bda5-495b-be99-2ff8c5960729'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.199 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:34 np0005546954 NetworkManager[55665]: <info>  [1764939874.2001] manager: (tapc0abe331-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.201 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.206 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.207 187164 INFO os_vif [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:39:ba,bridge_name='br-int',has_traffic_filtering=True,id=c0abe331-dd3a-4aee-9aaa-7f89d13ef185,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0abe331-dd')#033[00m
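The plug sequence above is a two-command ovsdbapp transaction: `AddPortCommand` attaches the tap device to `br-int`, then `DbSetCommand` stamps the Interface row's `external_ids` so OVN can bind the port. The hypothetical helper below (not part of os-vif or ovsdbapp) renders the equivalent single `ovs-vsctl` invocation, which can be useful when reproducing the plug by hand:

```python
def ovs_vsctl_equivalent(bridge, port, iface_id, mac, vm_uuid):
    """Render the AddPortCommand + DbSetCommand transaction from the log as
    one equivalent ovs-vsctl command string (illustrative helper only)."""
    ext_ids = {
        "iface-id": iface_id,          # Neutron port UUID; OVN matches on this
        "iface-status": "active",
        "attached-mac": mac,
        "vm-uuid": vm_uuid,
    }
    sets = " ".join(f'external_ids:{k}="{v}"' for k, v in ext_ids.items())
    return (f"ovs-vsctl --may-exist add-port {bridge} {port} "
            f"-- set Interface {port} {sets}")

cmd = ovs_vsctl_equivalent(
    "br-int", "tapc0abe331-dd",
    "c0abe331-dd3a-4aee-9aaa-7f89d13ef185",
    "fa:16:3e:a2:39:ba",
    "76812e6f-bda5-495b-be99-2ff8c5960729")
print(cmd)
```

The `--may-exist` flag mirrors the `may_exist=True` arguments visible in both logged commands, making the operation idempotent.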
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.270 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.270 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.270 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] No VIF found with MAC fa:16:3e:a2:39:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.271 187164 INFO nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Using config drive#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.612 187164 INFO nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Creating config drive at /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk.config#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.622 187164 DEBUG oslo_concurrency.processutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp6e4cigo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.752 187164 DEBUG oslo_concurrency.processutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp6e4cigo" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:04:34 np0005546954 kernel: tapc0abe331-dd: entered promiscuous mode
Dec  5 08:04:34 np0005546954 NetworkManager[55665]: <info>  [1764939874.8149] manager: (tapc0abe331-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Dec  5 08:04:34 np0005546954 ovn_controller[95566]: 2025-12-05T13:04:34Z|00216|binding|INFO|Claiming lport c0abe331-dd3a-4aee-9aaa-7f89d13ef185 for this chassis.
Dec  5 08:04:34 np0005546954 ovn_controller[95566]: 2025-12-05T13:04:34Z|00217|binding|INFO|c0abe331-dd3a-4aee-9aaa-7f89d13ef185: Claiming fa:16:3e:a2:39:ba 10.100.0.14
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.815 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.846 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:39:ba 10.100.0.14'], port_security=['fa:16:3e:a2:39:ba 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '76812e6f-bda5-495b-be99-2ff8c5960729', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=c0abe331-dd3a-4aee-9aaa-7f89d13ef185) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:04:34 np0005546954 ovn_controller[95566]: 2025-12-05T13:04:34Z|00218|binding|INFO|Setting lport c0abe331-dd3a-4aee-9aaa-7f89d13ef185 up in Southbound
Dec  5 08:04:34 np0005546954 ovn_controller[95566]: 2025-12-05T13:04:34Z|00219|binding|INFO|Setting lport c0abe331-dd3a-4aee-9aaa-7f89d13ef185 ovn-installed in OVS
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.848 104428 INFO neutron.agent.ovn.metadata.agent [-] Port c0abe331-dd3a-4aee-9aaa-7f89d13ef185 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis#033[00m
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.849 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.849 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:34 np0005546954 nova_compute[187160]: 2025-12-05 13:04:34.858 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.860 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d34a50ab-fe97-420d-b47d-8637b0c20f7d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.862 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd4389bc8-21 in ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.864 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd4389bc8-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.864 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c31b07bb-54a6-478c-a094-502c9332ac7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.865 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[50417dd4-05e9-4201-bafc-6b3aff79323e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:34 np0005546954 systemd-machined[153497]: New machine qemu-21-instance-00000018.
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.877 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[2a366d2a-e1e7-43ee-95b1-fc6a1ae8199d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:34 np0005546954 systemd[1]: Started Virtual Machine qemu-21-instance-00000018.
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.889 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7a2e83c8-8909-432f-a029-0cc7bfaed6f0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:34 np0005546954 systemd-udevd[216981]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:04:34 np0005546954 NetworkManager[55665]: <info>  [1764939874.9056] device (tapc0abe331-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 08:04:34 np0005546954 NetworkManager[55665]: <info>  [1764939874.9069] device (tapc0abe331-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.922 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[fe2f392a-7630-467e-a5f9-147216a227de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:34 np0005546954 systemd-udevd[216986]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:04:34 np0005546954 NetworkManager[55665]: <info>  [1764939874.9293] manager: (tapd4389bc8-20): new Veth device (/org/freedesktop/NetworkManager/Devices/87)
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.928 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5dcec0cd-0744-4274-94e3-87ea2ee68e38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.963 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[3fa82402-b658-4b56-b941-ba60b9b356e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.967 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[9eafd9fa-b05b-464c-8ff9-92e26b4329e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:34 np0005546954 NetworkManager[55665]: <info>  [1764939874.9876] device (tapd4389bc8-20): carrier: link connected
Dec  5 08:04:34 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:34.991 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea1a3d6-7574-4bbb-836b-371ba76a599b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.004 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4949c069-e94e-4660-8151-d58a2a1e9dd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488161, 'reachable_time': 37797, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217011, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.019 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[cff8cf3e-2c58-4197-b857-7cff773b7d9b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:43f7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 488161, 'tstamp': 488161}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217012, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.035 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f290e6ae-6483-4e85-901b-cc876964a054]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488161, 'reachable_time': 37797, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217013, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.058 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c6246f7f-f32f-483d-ab3f-dbf506f81153]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.122 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[a94f4511-7544-46e3-8e10-23b1f2925a89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.124 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.124 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.125 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.128 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:35 np0005546954 kernel: tapd4389bc8-20: entered promiscuous mode
Dec  5 08:04:35 np0005546954 NetworkManager[55665]: <info>  [1764939875.1305] manager: (tapd4389bc8-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.132 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.136 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.138 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:35 np0005546954 ovn_controller[95566]: 2025-12-05T13:04:35Z|00220|binding|INFO|Releasing lport 8dbe2af5-9f18-44ca-8f22-66854bcdd596 from this chassis (sb_readonly=0)
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.139 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.141 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.142 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7a3810e9-1b93-467f-8d9a-4ebbd298bee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.143 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/d4389bc8-2898-48b0-9741-5183b54fe83c.pid.haproxy
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID d4389bc8-2898-48b0-9741-5183b54fe83c
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 08:04:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:04:35.145 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'env', 'PROCESS_TAG=haproxy-d4389bc8-2898-48b0-9741-5183b54fe83c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d4389bc8-2898-48b0-9741-5183b54fe83c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.153 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.419 187164 DEBUG nova.compute.manager [req-041e1967-80f7-48fc-a41b-389638a9019b req-24b3185c-819f-4bc4-8078-9ab5a2c2e500 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Received event network-vif-plugged-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.419 187164 DEBUG oslo_concurrency.lockutils [req-041e1967-80f7-48fc-a41b-389638a9019b req-24b3185c-819f-4bc4-8078-9ab5a2c2e500 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.420 187164 DEBUG oslo_concurrency.lockutils [req-041e1967-80f7-48fc-a41b-389638a9019b req-24b3185c-819f-4bc4-8078-9ab5a2c2e500 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.420 187164 DEBUG oslo_concurrency.lockutils [req-041e1967-80f7-48fc-a41b-389638a9019b req-24b3185c-819f-4bc4-8078-9ab5a2c2e500 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.421 187164 DEBUG nova.compute.manager [req-041e1967-80f7-48fc-a41b-389638a9019b req-24b3185c-819f-4bc4-8078-9ab5a2c2e500 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Processing event network-vif-plugged-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 08:04:35 np0005546954 podman[217045]: 2025-12-05 13:04:35.54711477 +0000 UTC m=+0.065599132 container create ba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:04:35 np0005546954 systemd[1]: Started libpod-conmon-ba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9.scope.
Dec  5 08:04:35 np0005546954 podman[217045]: 2025-12-05 13:04:35.506456821 +0000 UTC m=+0.024941243 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 08:04:35 np0005546954 systemd[1]: Started libcrun container.
Dec  5 08:04:35 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8abec67e0299a3bd0c624d708a4a7bcb2c4bfedc3f2287738dad4d8c5724e6cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 08:04:35 np0005546954 podman[217045]: 2025-12-05 13:04:35.628122877 +0000 UTC m=+0.146607249 container init ba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:04:35 np0005546954 podman[217045]: 2025-12-05 13:04:35.633290397 +0000 UTC m=+0.151774749 container start ba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec  5 08:04:35 np0005546954 podman[197513]: time="2025-12-05T13:04:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:04:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:04:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:04:35 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[217060]: [NOTICE]   (217064) : New worker (217066) forked
Dec  5 08:04:35 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[217060]: [NOTICE]   (217064) : Loading success.
Dec  5 08:04:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:04:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3051 "" "Go-http-client/1.1"
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.744 187164 DEBUG nova.network.neutron [req-47b1fcce-fc8c-42ae-ad41-5589d02a5f2d req-ef60b2cc-354e-447c-ba00-ab142b685fcf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Updated VIF entry in instance network info cache for port c0abe331-dd3a-4aee-9aaa-7f89d13ef185. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.745 187164 DEBUG nova.network.neutron [req-47b1fcce-fc8c-42ae-ad41-5589d02a5f2d req-ef60b2cc-354e-447c-ba00-ab142b685fcf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Updating instance_info_cache with network_info: [{"id": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "address": "fa:16:3e:a2:39:ba", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0abe331-dd", "ovs_interfaceid": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:04:35 np0005546954 nova_compute[187160]: 2025-12-05 13:04:35.762 187164 DEBUG oslo_concurrency.lockutils [req-47b1fcce-fc8c-42ae-ad41-5589d02a5f2d req-ef60b2cc-354e-447c-ba00-ab142b685fcf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-76812e6f-bda5-495b-be99-2ff8c5960729" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.074 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939876.073807, 76812e6f-bda5-495b-be99-2ff8c5960729 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.074 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] VM Started (Lifecycle Event)#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.076 187164 DEBUG nova.compute.manager [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.080 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.084 187164 INFO nova.virt.libvirt.driver [-] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Instance spawned successfully.#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.084 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.085 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.093 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.098 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.102 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.102 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.103 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.103 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.103 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.104 187164 DEBUG nova.virt.libvirt.driver [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.140 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.140 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939876.0763402, 76812e6f-bda5-495b-be99-2ff8c5960729 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.140 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] VM Paused (Lifecycle Event)#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.174 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.176 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939876.0796869, 76812e6f-bda5-495b-be99-2ff8c5960729 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.176 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] VM Resumed (Lifecycle Event)#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.202 187164 INFO nova.compute.manager [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Took 4.56 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.202 187164 DEBUG nova.compute.manager [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.214 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.216 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.242 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.274 187164 INFO nova.compute.manager [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Took 7.95 seconds to build instance.#033[00m
Dec  5 08:04:36 np0005546954 nova_compute[187160]: 2025-12-05 13:04:36.294 187164 DEBUG oslo_concurrency.lockutils [None req-cd61be21-1b9a-4c3b-bd34-c97d29aedf7e 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.922s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:04:37 np0005546954 nova_compute[187160]: 2025-12-05 13:04:37.511 187164 DEBUG nova.compute.manager [req-6a3875f6-e0bd-4f6b-8083-2323e42dabf6 req-35ef91c7-b9b5-4c0c-b65d-e5f48dff274d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Received event network-vif-plugged-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:04:37 np0005546954 nova_compute[187160]: 2025-12-05 13:04:37.512 187164 DEBUG oslo_concurrency.lockutils [req-6a3875f6-e0bd-4f6b-8083-2323e42dabf6 req-35ef91c7-b9b5-4c0c-b65d-e5f48dff274d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:04:37 np0005546954 nova_compute[187160]: 2025-12-05 13:04:37.512 187164 DEBUG oslo_concurrency.lockutils [req-6a3875f6-e0bd-4f6b-8083-2323e42dabf6 req-35ef91c7-b9b5-4c0c-b65d-e5f48dff274d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:04:37 np0005546954 nova_compute[187160]: 2025-12-05 13:04:37.512 187164 DEBUG oslo_concurrency.lockutils [req-6a3875f6-e0bd-4f6b-8083-2323e42dabf6 req-35ef91c7-b9b5-4c0c-b65d-e5f48dff274d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:04:37 np0005546954 nova_compute[187160]: 2025-12-05 13:04:37.513 187164 DEBUG nova.compute.manager [req-6a3875f6-e0bd-4f6b-8083-2323e42dabf6 req-35ef91c7-b9b5-4c0c-b65d-e5f48dff274d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] No waiting events found dispatching network-vif-plugged-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:04:37 np0005546954 nova_compute[187160]: 2025-12-05 13:04:37.513 187164 WARNING nova.compute.manager [req-6a3875f6-e0bd-4f6b-8083-2323e42dabf6 req-35ef91c7-b9b5-4c0c-b65d-e5f48dff274d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Received unexpected event network-vif-plugged-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 for instance with vm_state active and task_state None.#033[00m
Dec  5 08:04:39 np0005546954 nova_compute[187160]: 2025-12-05 13:04:39.201 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:41 np0005546954 nova_compute[187160]: 2025-12-05 13:04:41.088 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:43 np0005546954 podman[217082]: 2025-12-05 13:04:43.593618091 +0000 UTC m=+0.079175472 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  5 08:04:44 np0005546954 nova_compute[187160]: 2025-12-05 13:04:44.206 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:46 np0005546954 nova_compute[187160]: 2025-12-05 13:04:46.092 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:48 np0005546954 podman[217113]: 2025-12-05 13:04:48.566996657 +0000 UTC m=+0.068105148 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 08:04:48 np0005546954 podman[217112]: 2025-12-05 13:04:48.626472348 +0000 UTC m=+0.127658191 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:04:48 np0005546954 ovn_controller[95566]: 2025-12-05T13:04:48Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a2:39:ba 10.100.0.14
Dec  5 08:04:48 np0005546954 ovn_controller[95566]: 2025-12-05T13:04:48Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a2:39:ba 10.100.0.14
Dec  5 08:04:49 np0005546954 nova_compute[187160]: 2025-12-05 13:04:49.208 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:04:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:04:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:04:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:04:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:04:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:04:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:04:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:04:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:04:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:04:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:04:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:04:51 np0005546954 nova_compute[187160]: 2025-12-05 13:04:51.094 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:54 np0005546954 nova_compute[187160]: 2025-12-05 13:04:54.213 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:56 np0005546954 nova_compute[187160]: 2025-12-05 13:04:56.098 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:04:59 np0005546954 nova_compute[187160]: 2025-12-05 13:04:59.215 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:00 np0005546954 podman[217161]: 2025-12-05 13:05:00.553017807 +0000 UTC m=+0.059549045 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:05:00 np0005546954 podman[217160]: 2025-12-05 13:05:00.554215533 +0000 UTC m=+0.066592232 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  5 08:05:01 np0005546954 nova_compute[187160]: 2025-12-05 13:05:01.101 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:04 np0005546954 nova_compute[187160]: 2025-12-05 13:05:04.220 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:04 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:04Z|00221|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Dec  5 08:05:05 np0005546954 nova_compute[187160]: 2025-12-05 13:05:05.368 187164 DEBUG nova.virt.libvirt.driver [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Creating tmpfile /var/lib/nova/instances/tmpwgahxche to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  5 08:05:05 np0005546954 nova_compute[187160]: 2025-12-05 13:05:05.490 187164 DEBUG nova.compute.manager [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwgahxche',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  5 08:05:05 np0005546954 podman[197513]: time="2025-12-05T13:05:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:05:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:05:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:05:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:05:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3056 "" "Go-http-client/1.1"
Dec  5 08:05:06 np0005546954 nova_compute[187160]: 2025-12-05 13:05:06.103 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:06 np0005546954 nova_compute[187160]: 2025-12-05 13:05:06.323 187164 DEBUG nova.compute.manager [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwgahxche',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='86c73a1d-eb82-4ab9-9714-0a0dc3f57225',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  5 08:05:06 np0005546954 nova_compute[187160]: 2025-12-05 13:05:06.353 187164 DEBUG oslo_concurrency.lockutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-86c73a1d-eb82-4ab9-9714-0a0dc3f57225" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:05:06 np0005546954 nova_compute[187160]: 2025-12-05 13:05:06.354 187164 DEBUG oslo_concurrency.lockutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-86c73a1d-eb82-4ab9-9714-0a0dc3f57225" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:05:06 np0005546954 nova_compute[187160]: 2025-12-05 13:05:06.354 187164 DEBUG nova.network.neutron [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 08:05:06 np0005546954 nova_compute[187160]: 2025-12-05 13:05:06.873 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:07 np0005546954 nova_compute[187160]: 2025-12-05 13:05:07.886 187164 DEBUG nova.network.neutron [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Updating instance_info_cache with network_info: [{"id": "0bfa1c75-3860-493b-8ae8-1b6935a9b91a", "address": "fa:16:3e:fc:93:3b", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bfa1c75-38", "ovs_interfaceid": "0bfa1c75-3860-493b-8ae8-1b6935a9b91a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:05:07 np0005546954 nova_compute[187160]: 2025-12-05 13:05:07.911 187164 DEBUG oslo_concurrency.lockutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-86c73a1d-eb82-4ab9-9714-0a0dc3f57225" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:05:07 np0005546954 nova_compute[187160]: 2025-12-05 13:05:07.913 187164 DEBUG nova.virt.libvirt.driver [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwgahxche',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='86c73a1d-eb82-4ab9-9714-0a0dc3f57225',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  5 08:05:07 np0005546954 nova_compute[187160]: 2025-12-05 13:05:07.913 187164 DEBUG nova.virt.libvirt.driver [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Creating instance directory: /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  5 08:05:07 np0005546954 nova_compute[187160]: 2025-12-05 13:05:07.914 187164 DEBUG nova.virt.libvirt.driver [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Creating disk.info with the contents: {'/var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk': 'qcow2', '/var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  5 08:05:07 np0005546954 nova_compute[187160]: 2025-12-05 13:05:07.914 187164 DEBUG nova.virt.libvirt.driver [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  5 08:05:07 np0005546954 nova_compute[187160]: 2025-12-05 13:05:07.915 187164 DEBUG nova.objects.instance [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 86c73a1d-eb82-4ab9-9714-0a0dc3f57225 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:05:07 np0005546954 nova_compute[187160]: 2025-12-05 13:05:07.947 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.013 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.015 187164 DEBUG oslo_concurrency.lockutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.015 187164 DEBUG oslo_concurrency.lockutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.029 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.102 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.104 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.296 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk 1073741824" returned: 0 in 0.192s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.298 187164 DEBUG oslo_concurrency.lockutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.299 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.390 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.391 187164 DEBUG nova.virt.disk.api [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Checking if we can resize image /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.391 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.467 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.469 187164 DEBUG nova.virt.disk.api [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Cannot resize image /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.469 187164 DEBUG nova.objects.instance [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'migration_context' on Instance uuid 86c73a1d-eb82-4ab9-9714-0a0dc3f57225 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.570 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.602 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk.config 485376" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.605 187164 DEBUG nova.virt.libvirt.volume.remotefs [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk.config to /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  5 08:05:08 np0005546954 nova_compute[187160]: 2025-12-05 13:05:08.606 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk.config /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.041 187164 DEBUG oslo_concurrency.processutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk.config /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.043 187164 DEBUG nova.virt.libvirt.driver [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.044 187164 DEBUG nova.virt.libvirt.vif [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T13:04:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-256337887',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-256337887',id=23,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:04:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-guc6ewze',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:04:14Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=86c73a1d-eb82-4ab9-9714-0a0dc3f57225,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0bfa1c75-3860-493b-8ae8-1b6935a9b91a", "address": "fa:16:3e:fc:93:3b", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0bfa1c75-38", "ovs_interfaceid": "0bfa1c75-3860-493b-8ae8-1b6935a9b91a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.045 187164 DEBUG nova.network.os_vif_util [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "0bfa1c75-3860-493b-8ae8-1b6935a9b91a", "address": "fa:16:3e:fc:93:3b", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0bfa1c75-38", "ovs_interfaceid": "0bfa1c75-3860-493b-8ae8-1b6935a9b91a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.046 187164 DEBUG nova.network.os_vif_util [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:93:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bfa1c75-3860-493b-8ae8-1b6935a9b91a,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bfa1c75-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.047 187164 DEBUG os_vif [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:93:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bfa1c75-3860-493b-8ae8-1b6935a9b91a,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bfa1c75-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.048 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.049 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.050 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.054 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.054 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0bfa1c75-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.055 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0bfa1c75-38, col_values=(('external_ids', {'iface-id': '0bfa1c75-3860-493b-8ae8-1b6935a9b91a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fc:93:3b', 'vm-uuid': '86c73a1d-eb82-4ab9-9714-0a0dc3f57225'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.057 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:09 np0005546954 NetworkManager[55665]: <info>  [1764939909.0580] manager: (tap0bfa1c75-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.060 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.068 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.069 187164 INFO os_vif [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:93:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bfa1c75-3860-493b-8ae8-1b6935a9b91a,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bfa1c75-38')#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.070 187164 DEBUG nova.virt.libvirt.driver [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  5 08:05:09 np0005546954 nova_compute[187160]: 2025-12-05 13:05:09.070 187164 DEBUG nova.compute.manager [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwgahxche',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='86c73a1d-eb82-4ab9-9714-0a0dc3f57225',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  5 08:05:10 np0005546954 nova_compute[187160]: 2025-12-05 13:05:10.074 187164 DEBUG nova.network.neutron [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Port 0bfa1c75-3860-493b-8ae8-1b6935a9b91a updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  5 08:05:10 np0005546954 nova_compute[187160]: 2025-12-05 13:05:10.076 187164 DEBUG nova.compute.manager [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwgahxche',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='86c73a1d-eb82-4ab9-9714-0a0dc3f57225',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  5 08:05:10 np0005546954 systemd[1]: Starting libvirt proxy daemon...
Dec  5 08:05:10 np0005546954 systemd[1]: Started libvirt proxy daemon.
Dec  5 08:05:10 np0005546954 kernel: tap0bfa1c75-38: entered promiscuous mode
Dec  5 08:05:10 np0005546954 NetworkManager[55665]: <info>  [1764939910.4322] manager: (tap0bfa1c75-38): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Dec  5 08:05:10 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:10Z|00222|binding|INFO|Claiming lport 0bfa1c75-3860-493b-8ae8-1b6935a9b91a for this additional chassis.
Dec  5 08:05:10 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:10Z|00223|binding|INFO|0bfa1c75-3860-493b-8ae8-1b6935a9b91a: Claiming fa:16:3e:fc:93:3b 10.100.0.6
Dec  5 08:05:10 np0005546954 nova_compute[187160]: 2025-12-05 13:05:10.436 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:10 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:10Z|00224|binding|INFO|Setting lport 0bfa1c75-3860-493b-8ae8-1b6935a9b91a ovn-installed in OVS
Dec  5 08:05:10 np0005546954 nova_compute[187160]: 2025-12-05 13:05:10.463 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:10 np0005546954 nova_compute[187160]: 2025-12-05 13:05:10.467 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:10 np0005546954 systemd-udevd[217255]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:05:10 np0005546954 NetworkManager[55665]: <info>  [1764939910.4839] device (tap0bfa1c75-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 08:05:10 np0005546954 NetworkManager[55665]: <info>  [1764939910.4864] device (tap0bfa1c75-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 08:05:10 np0005546954 systemd-machined[153497]: New machine qemu-22-instance-00000017.
Dec  5 08:05:10 np0005546954 systemd[1]: Started Virtual Machine qemu-22-instance-00000017.
Dec  5 08:05:10 np0005546954 nova_compute[187160]: 2025-12-05 13:05:10.848 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939910.8474896, 86c73a1d-eb82-4ab9-9714-0a0dc3f57225 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:05:10 np0005546954 nova_compute[187160]: 2025-12-05 13:05:10.849 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] VM Started (Lifecycle Event)#033[00m
Dec  5 08:05:10 np0005546954 nova_compute[187160]: 2025-12-05 13:05:10.881 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.105 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.168 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.169 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.169 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.334 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-76812e6f-bda5-495b-be99-2ff8c5960729" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.334 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-76812e6f-bda5-495b-be99-2ff8c5960729" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.334 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.334 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 76812e6f-bda5-495b-be99-2ff8c5960729 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.620 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764939911.620324, 86c73a1d-eb82-4ab9-9714-0a0dc3f57225 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.621 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] VM Resumed (Lifecycle Event)#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.644 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.648 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:05:11 np0005546954 nova_compute[187160]: 2025-12-05 13:05:11.674 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.574 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:05:12 np0005546954 nova_compute[187160]: 2025-12-05 13:05:12.575 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.577 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 08:05:12 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:12Z|00225|binding|INFO|Claiming lport 0bfa1c75-3860-493b-8ae8-1b6935a9b91a for this chassis.
Dec  5 08:05:12 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:12Z|00226|binding|INFO|0bfa1c75-3860-493b-8ae8-1b6935a9b91a: Claiming fa:16:3e:fc:93:3b 10.100.0.6
Dec  5 08:05:12 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:12Z|00227|binding|INFO|Setting lport 0bfa1c75-3860-493b-8ae8-1b6935a9b91a up in Southbound
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.611 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:93:3b 10.100.0.6'], port_security=['fa:16:3e:fc:93:3b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '86c73a1d-eb82-4ab9-9714-0a0dc3f57225', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '11', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=0bfa1c75-3860-493b-8ae8-1b6935a9b91a) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.613 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 0bfa1c75-3860-493b-8ae8-1b6935a9b91a in datapath d4389bc8-2898-48b0-9741-5183b54fe83c bound to our chassis#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.617 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 08:05:12 np0005546954 nova_compute[187160]: 2025-12-05 13:05:12.621 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Updating instance_info_cache with network_info: [{"id": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "address": "fa:16:3e:a2:39:ba", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0abe331-dd", "ovs_interfaceid": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.636 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[cbf1f5c8-f34f-4899-9371-67b006db35cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:12 np0005546954 nova_compute[187160]: 2025-12-05 13:05:12.645 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-76812e6f-bda5-495b-be99-2ff8c5960729" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:05:12 np0005546954 nova_compute[187160]: 2025-12-05 13:05:12.645 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 08:05:12 np0005546954 nova_compute[187160]: 2025-12-05 13:05:12.646 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.670 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[70ca83ce-4e28-4865-9af3-f300ce876c73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.672 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[3e743f43-94a7-483c-b129-f2e566290c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.705 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[88cf3cc8-8c0b-4076-a0a9-d35205cfc0cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.719 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[dd13b26d-29a8-4a26-b42f-0ebae4609d00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488161, 'reachable_time': 37797, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217286, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.737 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[45e90887-69bd-418f-ad62-ad1d49b97751]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 488171, 'tstamp': 488171}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217287, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 488174, 'tstamp': 488174}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217287, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.739 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:12 np0005546954 nova_compute[187160]: 2025-12-05 13:05:12.741 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.743 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:12 np0005546954 nova_compute[187160]: 2025-12-05 13:05:12.743 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.743 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.744 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:12 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:12.744 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:05:12 np0005546954 nova_compute[187160]: 2025-12-05 13:05:12.790 187164 INFO nova.compute.manager [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Post operation of migration started#033[00m
Dec  5 08:05:13 np0005546954 nova_compute[187160]: 2025-12-05 13:05:13.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:13 np0005546954 nova_compute[187160]: 2025-12-05 13:05:13.173 187164 DEBUG oslo_concurrency.lockutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-86c73a1d-eb82-4ab9-9714-0a0dc3f57225" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:05:13 np0005546954 nova_compute[187160]: 2025-12-05 13:05:13.173 187164 DEBUG oslo_concurrency.lockutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-86c73a1d-eb82-4ab9-9714-0a0dc3f57225" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:05:13 np0005546954 nova_compute[187160]: 2025-12-05 13:05:13.174 187164 DEBUG nova.network.neutron [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 08:05:14 np0005546954 nova_compute[187160]: 2025-12-05 13:05:14.058 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:14 np0005546954 podman[217288]: 2025-12-05 13:05:14.606434633 +0000 UTC m=+0.108396276 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 08:05:15 np0005546954 nova_compute[187160]: 2025-12-05 13:05:15.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:15 np0005546954 nova_compute[187160]: 2025-12-05 13:05:15.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 08:05:15 np0005546954 nova_compute[187160]: 2025-12-05 13:05:15.054 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 08:05:15 np0005546954 nova_compute[187160]: 2025-12-05 13:05:15.903 187164 DEBUG nova.network.neutron [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Updating instance_info_cache with network_info: [{"id": "0bfa1c75-3860-493b-8ae8-1b6935a9b91a", "address": "fa:16:3e:fc:93:3b", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bfa1c75-38", "ovs_interfaceid": "0bfa1c75-3860-493b-8ae8-1b6935a9b91a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:05:15 np0005546954 nova_compute[187160]: 2025-12-05 13:05:15.921 187164 DEBUG oslo_concurrency.lockutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-86c73a1d-eb82-4ab9-9714-0a0dc3f57225" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:05:15 np0005546954 nova_compute[187160]: 2025-12-05 13:05:15.940 187164 DEBUG oslo_concurrency.lockutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:15 np0005546954 nova_compute[187160]: 2025-12-05 13:05:15.941 187164 DEBUG oslo_concurrency.lockutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:15 np0005546954 nova_compute[187160]: 2025-12-05 13:05:15.942 187164 DEBUG oslo_concurrency.lockutils [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:15 np0005546954 nova_compute[187160]: 2025-12-05 13:05:15.948 187164 INFO nova.virt.libvirt.driver [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  5 08:05:15 np0005546954 virtqemud[186730]: Domain id=22 name='instance-00000017' uuid=86c73a1d-eb82-4ab9-9714-0a0dc3f57225 is tainted: custom-monitor
Dec  5 08:05:16 np0005546954 nova_compute[187160]: 2025-12-05 13:05:16.109 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:16.581 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:16 np0005546954 nova_compute[187160]: 2025-12-05 13:05:16.959 187164 INFO nova.virt.libvirt.driver [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  5 08:05:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:16.968 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:16.969 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:16.970 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:17 np0005546954 nova_compute[187160]: 2025-12-05 13:05:17.050 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:17 np0005546954 nova_compute[187160]: 2025-12-05 13:05:17.051 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:17 np0005546954 nova_compute[187160]: 2025-12-05 13:05:17.051 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:17 np0005546954 nova_compute[187160]: 2025-12-05 13:05:17.966 187164 INFO nova.virt.libvirt.driver [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  5 08:05:17 np0005546954 nova_compute[187160]: 2025-12-05 13:05:17.971 187164 DEBUG nova.compute.manager [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:05:17 np0005546954 nova_compute[187160]: 2025-12-05 13:05:17.994 187164 DEBUG nova.objects.instance [None req-d39ed993-604a-4996-99b1-419e019cba55 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.065 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.066 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.066 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.066 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.164 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.257 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.258 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.314 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.323 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.389 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.390 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.483 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.723 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.724 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5548MB free_disk=73.27188491821289GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.725 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.725 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.783 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Applying migration context for instance 86c73a1d-eb82-4ab9-9714-0a0dc3f57225 as it has an incoming, in-progress migration dc112f2e-59e8-4c53-9dab-1ada1e49a61d. Migration status is running _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.784 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.799 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.872 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 76812e6f-bda5-495b-be99-2ff8c5960729 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.873 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 86c73a1d-eb82-4ab9-9714-0a0dc3f57225 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.873 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:05:18 np0005546954 nova_compute[187160]: 2025-12-05 13:05:18.873 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:05:19 np0005546954 nova_compute[187160]: 2025-12-05 13:05:19.023 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:05:19 np0005546954 nova_compute[187160]: 2025-12-05 13:05:19.061 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:19 np0005546954 nova_compute[187160]: 2025-12-05 13:05:19.259 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:05:19 np0005546954 nova_compute[187160]: 2025-12-05 13:05:19.281 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:05:19 np0005546954 nova_compute[187160]: 2025-12-05 13:05:19.282 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:19 np0005546954 nova_compute[187160]: 2025-12-05 13:05:19.282 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:19 np0005546954 nova_compute[187160]: 2025-12-05 13:05:19.283 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 08:05:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:05:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:05:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:05:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:05:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:05:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:05:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:05:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:05:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:05:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:05:19 np0005546954 podman[217322]: 2025-12-05 13:05:19.59798656 +0000 UTC m=+0.082820404 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 08:05:19 np0005546954 podman[217321]: 2025-12-05 13:05:19.640392173 +0000 UTC m=+0.130429718 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 08:05:20 np0005546954 nova_compute[187160]: 2025-12-05 13:05:20.096 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:20 np0005546954 nova_compute[187160]: 2025-12-05 13:05:20.145 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Triggering sync for uuid 86c73a1d-eb82-4ab9-9714-0a0dc3f57225 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec  5 08:05:20 np0005546954 nova_compute[187160]: 2025-12-05 13:05:20.146 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Triggering sync for uuid 76812e6f-bda5-495b-be99-2ff8c5960729 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec  5 08:05:20 np0005546954 nova_compute[187160]: 2025-12-05 13:05:20.147 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:20 np0005546954 nova_compute[187160]: 2025-12-05 13:05:20.147 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:20 np0005546954 nova_compute[187160]: 2025-12-05 13:05:20.148 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "76812e6f-bda5-495b-be99-2ff8c5960729" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:20 np0005546954 nova_compute[187160]: 2025-12-05 13:05:20.149 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "76812e6f-bda5-495b-be99-2ff8c5960729" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:20 np0005546954 nova_compute[187160]: 2025-12-05 13:05:20.183 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "76812e6f-bda5-495b-be99-2ff8c5960729" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:20 np0005546954 nova_compute[187160]: 2025-12-05 13:05:20.185 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:21 np0005546954 nova_compute[187160]: 2025-12-05 13:05:21.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:21 np0005546954 nova_compute[187160]: 2025-12-05 13:05:21.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:05:21 np0005546954 nova_compute[187160]: 2025-12-05 13:05:21.111 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.125 187164 DEBUG oslo_concurrency.lockutils [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "76812e6f-bda5-495b-be99-2ff8c5960729" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.126 187164 DEBUG oslo_concurrency.lockutils [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.126 187164 DEBUG oslo_concurrency.lockutils [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.127 187164 DEBUG oslo_concurrency.lockutils [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.127 187164 DEBUG oslo_concurrency.lockutils [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.128 187164 INFO nova.compute.manager [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Terminating instance#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.130 187164 DEBUG nova.compute.manager [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 08:05:22 np0005546954 kernel: tapc0abe331-dd (unregistering): left promiscuous mode
Dec  5 08:05:22 np0005546954 NetworkManager[55665]: <info>  [1764939922.1588] device (tapc0abe331-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.168 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:22 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:22Z|00228|binding|INFO|Releasing lport c0abe331-dd3a-4aee-9aaa-7f89d13ef185 from this chassis (sb_readonly=0)
Dec  5 08:05:22 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:22Z|00229|binding|INFO|Setting lport c0abe331-dd3a-4aee-9aaa-7f89d13ef185 down in Southbound
Dec  5 08:05:22 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:22Z|00230|binding|INFO|Removing iface tapc0abe331-dd ovn-installed in OVS
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.170 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.176 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:39:ba 10.100.0.14'], port_security=['fa:16:3e:a2:39:ba 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '76812e6f-bda5-495b-be99-2ff8c5960729', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=c0abe331-dd3a-4aee-9aaa-7f89d13ef185) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.177 104428 INFO neutron.agent.ovn.metadata.agent [-] Port c0abe331-dd3a-4aee-9aaa-7f89d13ef185 in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.178 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d4389bc8-2898-48b0-9741-5183b54fe83c#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.187 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.197 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c62433a4-fa61-496a-91a0-5b913082be54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:22 np0005546954 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000018.scope: Deactivated successfully.
Dec  5 08:05:22 np0005546954 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000018.scope: Consumed 14.848s CPU time.
Dec  5 08:05:22 np0005546954 systemd-machined[153497]: Machine qemu-21-instance-00000018 terminated.
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.228 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[c59d5173-9a61-406b-a1e6-bed092779b44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.231 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[187bbc8d-c289-43f0-84dd-0748156499fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.259 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0783b0-c7be-44af-8318-8ef98feef399]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.273 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[6f73979e-da97-4f52-84d1-b6d2f591e1a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd4389bc8-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:43:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488161, 'reachable_time': 37797, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217377, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.288 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c4653e35-6ed2-4d56-903c-a9143c237ad8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 488171, 'tstamp': 488171}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217378, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd4389bc8-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 488174, 'tstamp': 488174}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217378, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.289 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.290 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.295 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4389bc8-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.295 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.295 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.296 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd4389bc8-20, col_values=(('external_ids', {'iface-id': '8dbe2af5-9f18-44ca-8f22-66854bcdd596'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:22 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:22.296 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.399 187164 INFO nova.virt.libvirt.driver [-] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Instance destroyed successfully.#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.400 187164 DEBUG nova.objects.instance [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'resources' on Instance uuid 76812e6f-bda5-495b-be99-2ff8c5960729 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.414 187164 DEBUG nova.virt.libvirt.vif [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T13:04:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-778787294',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-778787294',id=24,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:04:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-sbkc9t88',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_na
me='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T13:04:36Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=76812e6f-bda5-495b-be99-2ff8c5960729,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "address": "fa:16:3e:a2:39:ba", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0abe331-dd", "ovs_interfaceid": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.415 187164 DEBUG nova.network.os_vif_util [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "address": "fa:16:3e:a2:39:ba", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0abe331-dd", "ovs_interfaceid": "c0abe331-dd3a-4aee-9aaa-7f89d13ef185", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.415 187164 DEBUG nova.network.os_vif_util [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a2:39:ba,bridge_name='br-int',has_traffic_filtering=True,id=c0abe331-dd3a-4aee-9aaa-7f89d13ef185,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0abe331-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.416 187164 DEBUG os_vif [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:39:ba,bridge_name='br-int',has_traffic_filtering=True,id=c0abe331-dd3a-4aee-9aaa-7f89d13ef185,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0abe331-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.417 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.417 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc0abe331-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.419 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.421 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.423 187164 INFO os_vif [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:39:ba,bridge_name='br-int',has_traffic_filtering=True,id=c0abe331-dd3a-4aee-9aaa-7f89d13ef185,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0abe331-dd')#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.423 187164 INFO nova.virt.libvirt.driver [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Deleting instance files /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729_del#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.424 187164 INFO nova.virt.libvirt.driver [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Deletion of /var/lib/nova/instances/76812e6f-bda5-495b-be99-2ff8c5960729_del complete#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.527 187164 INFO nova.compute.manager [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.528 187164 DEBUG oslo.service.loopingcall [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.528 187164 DEBUG nova.compute.manager [-] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 08:05:22 np0005546954 nova_compute[187160]: 2025-12-05 13:05:22.528 187164 DEBUG nova.network.neutron [-] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 08:05:23 np0005546954 nova_compute[187160]: 2025-12-05 13:05:23.075 187164 DEBUG nova.compute.manager [req-b86ebcf9-5f6d-499a-ad3d-34b8bc9467ea req-1f53d74d-2d4b-458c-a8cf-6c841a737432 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Received event network-vif-unplugged-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:05:23 np0005546954 nova_compute[187160]: 2025-12-05 13:05:23.076 187164 DEBUG oslo_concurrency.lockutils [req-b86ebcf9-5f6d-499a-ad3d-34b8bc9467ea req-1f53d74d-2d4b-458c-a8cf-6c841a737432 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:23 np0005546954 nova_compute[187160]: 2025-12-05 13:05:23.077 187164 DEBUG oslo_concurrency.lockutils [req-b86ebcf9-5f6d-499a-ad3d-34b8bc9467ea req-1f53d74d-2d4b-458c-a8cf-6c841a737432 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:23 np0005546954 nova_compute[187160]: 2025-12-05 13:05:23.077 187164 DEBUG oslo_concurrency.lockutils [req-b86ebcf9-5f6d-499a-ad3d-34b8bc9467ea req-1f53d74d-2d4b-458c-a8cf-6c841a737432 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:23 np0005546954 nova_compute[187160]: 2025-12-05 13:05:23.078 187164 DEBUG nova.compute.manager [req-b86ebcf9-5f6d-499a-ad3d-34b8bc9467ea req-1f53d74d-2d4b-458c-a8cf-6c841a737432 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] No waiting events found dispatching network-vif-unplugged-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:05:23 np0005546954 nova_compute[187160]: 2025-12-05 13:05:23.078 187164 DEBUG nova.compute.manager [req-b86ebcf9-5f6d-499a-ad3d-34b8bc9467ea req-1f53d74d-2d4b-458c-a8cf-6c841a737432 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Received event network-vif-unplugged-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 08:05:24 np0005546954 nova_compute[187160]: 2025-12-05 13:05:24.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:24 np0005546954 nova_compute[187160]: 2025-12-05 13:05:24.234 187164 DEBUG nova.network.neutron [-] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:05:24 np0005546954 nova_compute[187160]: 2025-12-05 13:05:24.257 187164 INFO nova.compute.manager [-] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Took 1.73 seconds to deallocate network for instance.#033[00m
Dec  5 08:05:24 np0005546954 nova_compute[187160]: 2025-12-05 13:05:24.301 187164 DEBUG nova.compute.manager [req-d634cb7a-3742-46f2-ae4f-8e68dcc0fcaa req-421579c6-fb46-49a2-9ec9-0416136bc83c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Received event network-vif-deleted-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:05:24 np0005546954 nova_compute[187160]: 2025-12-05 13:05:24.319 187164 DEBUG oslo_concurrency.lockutils [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:24 np0005546954 nova_compute[187160]: 2025-12-05 13:05:24.320 187164 DEBUG oslo_concurrency.lockutils [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:24 np0005546954 nova_compute[187160]: 2025-12-05 13:05:24.409 187164 DEBUG nova.compute.provider_tree [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:05:24 np0005546954 nova_compute[187160]: 2025-12-05 13:05:24.443 187164 DEBUG nova.scheduler.client.report [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:05:24 np0005546954 nova_compute[187160]: 2025-12-05 13:05:24.483 187164 DEBUG oslo_concurrency.lockutils [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:24 np0005546954 nova_compute[187160]: 2025-12-05 13:05:24.515 187164 INFO nova.scheduler.client.report [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Deleted allocations for instance 76812e6f-bda5-495b-be99-2ff8c5960729#033[00m
Dec  5 08:05:24 np0005546954 nova_compute[187160]: 2025-12-05 13:05:24.718 187164 DEBUG oslo_concurrency.lockutils [None req-f3c21f87-5cab-43cf-af50-ebd78ad378c4 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.304 187164 DEBUG oslo_concurrency.lockutils [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.305 187164 DEBUG oslo_concurrency.lockutils [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.305 187164 DEBUG oslo_concurrency.lockutils [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.305 187164 DEBUG oslo_concurrency.lockutils [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.306 187164 DEBUG oslo_concurrency.lockutils [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.307 187164 INFO nova.compute.manager [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Terminating instance#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.308 187164 DEBUG nova.compute.manager [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 08:05:25 np0005546954 kernel: tap0bfa1c75-38 (unregistering): left promiscuous mode
Dec  5 08:05:25 np0005546954 NetworkManager[55665]: <info>  [1764939925.3344] device (tap0bfa1c75-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.338 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:25 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:25Z|00231|binding|INFO|Releasing lport 0bfa1c75-3860-493b-8ae8-1b6935a9b91a from this chassis (sb_readonly=0)
Dec  5 08:05:25 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:25Z|00232|binding|INFO|Setting lport 0bfa1c75-3860-493b-8ae8-1b6935a9b91a down in Southbound
Dec  5 08:05:25 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:25Z|00233|binding|INFO|Removing iface tap0bfa1c75-38 ovn-installed in OVS
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.344 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:93:3b 10.100.0.6'], port_security=['fa:16:3e:fc:93:3b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '86c73a1d-eb82-4ab9-9714-0a0dc3f57225', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4389bc8-2898-48b0-9741-5183b54fe83c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6ae0d0dcde04b85b6dae45560cca988', 'neutron:revision_number': '13', 'neutron:security_group_ids': '9ea68f98-ae7c-4c35-bc5a-7c1a27f7e5f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb60c317-acba-4c06-b29b-f7c6c7a5660a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=0bfa1c75-3860-493b-8ae8-1b6935a9b91a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.346 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 0bfa1c75-3860-493b-8ae8-1b6935a9b91a in datapath d4389bc8-2898-48b0-9741-5183b54fe83c unbound from our chassis#033[00m
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.348 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d4389bc8-2898-48b0-9741-5183b54fe83c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.349 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7860c8a3-d178-45fd-a52e-f72299cd6a10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.350 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c namespace which is not needed anymore#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.355 187164 DEBUG nova.compute.manager [req-28a41af7-fec7-4367-b463-0a3ed1e6d5e7 req-7227bee6-b787-4ff7-af55-a7901e5c5cbe 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Received event network-vif-plugged-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.355 187164 DEBUG oslo_concurrency.lockutils [req-28a41af7-fec7-4367-b463-0a3ed1e6d5e7 req-7227bee6-b787-4ff7-af55-a7901e5c5cbe 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.356 187164 DEBUG oslo_concurrency.lockutils [req-28a41af7-fec7-4367-b463-0a3ed1e6d5e7 req-7227bee6-b787-4ff7-af55-a7901e5c5cbe 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.356 187164 DEBUG oslo_concurrency.lockutils [req-28a41af7-fec7-4367-b463-0a3ed1e6d5e7 req-7227bee6-b787-4ff7-af55-a7901e5c5cbe 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "76812e6f-bda5-495b-be99-2ff8c5960729-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.356 187164 DEBUG nova.compute.manager [req-28a41af7-fec7-4367-b463-0a3ed1e6d5e7 req-7227bee6-b787-4ff7-af55-a7901e5c5cbe 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] No waiting events found dispatching network-vif-plugged-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.357 187164 WARNING nova.compute.manager [req-28a41af7-fec7-4367-b463-0a3ed1e6d5e7 req-7227bee6-b787-4ff7-af55-a7901e5c5cbe 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Received unexpected event network-vif-plugged-c0abe331-dd3a-4aee-9aaa-7f89d13ef185 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.357 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:25 np0005546954 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000017.scope: Deactivated successfully.
Dec  5 08:05:25 np0005546954 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000017.scope: Consumed 1.476s CPU time.
Dec  5 08:05:25 np0005546954 systemd-machined[153497]: Machine qemu-22-instance-00000017 terminated.
Dec  5 08:05:25 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[217060]: [NOTICE]   (217064) : haproxy version is 2.8.14-c23fe91
Dec  5 08:05:25 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[217060]: [NOTICE]   (217064) : path to executable is /usr/sbin/haproxy
Dec  5 08:05:25 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[217060]: [WARNING]  (217064) : Exiting Master process...
Dec  5 08:05:25 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[217060]: [WARNING]  (217064) : Exiting Master process...
Dec  5 08:05:25 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[217060]: [ALERT]    (217064) : Current worker (217066) exited with code 143 (Terminated)
Dec  5 08:05:25 np0005546954 neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c[217060]: [WARNING]  (217064) : All workers exited. Exiting... (0)
Dec  5 08:05:25 np0005546954 systemd[1]: libpod-ba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9.scope: Deactivated successfully.
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.564 187164 INFO nova.virt.libvirt.driver [-] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Instance destroyed successfully.#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.564 187164 DEBUG nova.objects.instance [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lazy-loading 'resources' on Instance uuid 86c73a1d-eb82-4ab9-9714-0a0dc3f57225 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:05:25 np0005546954 podman[217421]: 2025-12-05 13:05:25.565062714 +0000 UTC m=+0.135815305 container died ba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.577 187164 DEBUG nova.virt.libvirt.vif [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T13:04:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-256337887',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-256337887',id=23,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:04:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6ae0d0dcde04b85b6dae45560cca988',ramdisk_id='',reservation_id='r-guc6ewze',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-192029678',owner_user_name='tempest-TestExecuteStrategies-192029678-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T13:05:18Z,user_data=None,user_id='0ae0bb20ac8b4be99eb1abddc7310436',uuid=86c73a1d-eb82-4ab9-9714-0a0dc3f57225,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0bfa1c75-3860-493b-8ae8-1b6935a9b91a", "address": "fa:16:3e:fc:93:3b", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bfa1c75-38", "ovs_interfaceid": "0bfa1c75-3860-493b-8ae8-1b6935a9b91a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.577 187164 DEBUG nova.network.os_vif_util [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converting VIF {"id": "0bfa1c75-3860-493b-8ae8-1b6935a9b91a", "address": "fa:16:3e:fc:93:3b", "network": {"id": "d4389bc8-2898-48b0-9741-5183b54fe83c", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-338715231-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6ae0d0dcde04b85b6dae45560cca988", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bfa1c75-38", "ovs_interfaceid": "0bfa1c75-3860-493b-8ae8-1b6935a9b91a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.578 187164 DEBUG nova.network.os_vif_util [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fc:93:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bfa1c75-3860-493b-8ae8-1b6935a9b91a,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bfa1c75-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.578 187164 DEBUG os_vif [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:93:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bfa1c75-3860-493b-8ae8-1b6935a9b91a,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bfa1c75-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.580 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.580 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0bfa1c75-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.582 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.583 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.585 187164 INFO os_vif [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:93:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bfa1c75-3860-493b-8ae8-1b6935a9b91a,network=Network(d4389bc8-2898-48b0-9741-5183b54fe83c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0bfa1c75-38')#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.586 187164 INFO nova.virt.libvirt.driver [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Deleting instance files /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225_del#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.587 187164 INFO nova.virt.libvirt.driver [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Deletion of /var/lib/nova/instances/86c73a1d-eb82-4ab9-9714-0a0dc3f57225_del complete#033[00m
Dec  5 08:05:25 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9-userdata-shm.mount: Deactivated successfully.
Dec  5 08:05:25 np0005546954 systemd[1]: var-lib-containers-storage-overlay-8abec67e0299a3bd0c624d708a4a7bcb2c4bfedc3f2287738dad4d8c5724e6cc-merged.mount: Deactivated successfully.
Dec  5 08:05:25 np0005546954 podman[217421]: 2025-12-05 13:05:25.62471526 +0000 UTC m=+0.195467841 container cleanup ba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 08:05:25 np0005546954 systemd[1]: libpod-conmon-ba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9.scope: Deactivated successfully.
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.637 187164 INFO nova.compute.manager [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.637 187164 DEBUG oslo.service.loopingcall [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.638 187164 DEBUG nova.compute.manager [-] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.638 187164 DEBUG nova.network.neutron [-] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 08:05:25 np0005546954 podman[217469]: 2025-12-05 13:05:25.684511231 +0000 UTC m=+0.041260408 container remove ba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.689 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[a3a74286-c741-4034-b8c0-c1d3c35f438e]: (4, ('Fri Dec  5 01:05:25 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (ba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9)\nba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9\nFri Dec  5 01:05:25 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c (ba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9)\nba3abdfc349a90b8a5c8ac5317245f41d95c5265b61edd443e9eedffb1f69ed9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.691 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c1f70360-813d-4a79-ac8c-fd7c37028e0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.691 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4389bc8-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:05:25 np0005546954 kernel: tapd4389bc8-20: left promiscuous mode
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.694 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.697 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[0b51fa57-e961-40d6-a468-b6c1f5c97b86]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:25 np0005546954 nova_compute[187160]: 2025-12-05 13:05:25.705 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.712 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[10dde500-4ce5-4d20-94b7-2a02dce7552f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.713 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[182850a9-b614-4c86-8af5-a79d8ddd4eb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.730 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5dd302bf-c7e2-4eeb-bd25-26206cbbe792]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488154, 'reachable_time': 23107, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217483, 'error': None, 'target': 'ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.733 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d4389bc8-2898-48b0-9741-5183b54fe83c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 08:05:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:05:25.733 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[b866c5e1-8b3b-46ba-aebb-9db24d51a8af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:05:25 np0005546954 systemd[1]: run-netns-ovnmeta\x2dd4389bc8\x2d2898\x2d48b0\x2d9741\x2d5183b54fe83c.mount: Deactivated successfully.
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.113 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.399 187164 DEBUG nova.compute.manager [req-328e7f9b-2ef3-4f7a-9631-6cb9ee6c5b01 req-cc57461f-75b0-4135-9620-b5ff7a37451f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Received event network-vif-unplugged-0bfa1c75-3860-493b-8ae8-1b6935a9b91a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.400 187164 DEBUG oslo_concurrency.lockutils [req-328e7f9b-2ef3-4f7a-9631-6cb9ee6c5b01 req-cc57461f-75b0-4135-9620-b5ff7a37451f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.400 187164 DEBUG oslo_concurrency.lockutils [req-328e7f9b-2ef3-4f7a-9631-6cb9ee6c5b01 req-cc57461f-75b0-4135-9620-b5ff7a37451f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.400 187164 DEBUG oslo_concurrency.lockutils [req-328e7f9b-2ef3-4f7a-9631-6cb9ee6c5b01 req-cc57461f-75b0-4135-9620-b5ff7a37451f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.400 187164 DEBUG nova.compute.manager [req-328e7f9b-2ef3-4f7a-9631-6cb9ee6c5b01 req-cc57461f-75b0-4135-9620-b5ff7a37451f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] No waiting events found dispatching network-vif-unplugged-0bfa1c75-3860-493b-8ae8-1b6935a9b91a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.401 187164 DEBUG nova.compute.manager [req-328e7f9b-2ef3-4f7a-9631-6cb9ee6c5b01 req-cc57461f-75b0-4135-9620-b5ff7a37451f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Received event network-vif-unplugged-0bfa1c75-3860-493b-8ae8-1b6935a9b91a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.401 187164 DEBUG nova.compute.manager [req-328e7f9b-2ef3-4f7a-9631-6cb9ee6c5b01 req-cc57461f-75b0-4135-9620-b5ff7a37451f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Received event network-vif-plugged-0bfa1c75-3860-493b-8ae8-1b6935a9b91a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.401 187164 DEBUG oslo_concurrency.lockutils [req-328e7f9b-2ef3-4f7a-9631-6cb9ee6c5b01 req-cc57461f-75b0-4135-9620-b5ff7a37451f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.401 187164 DEBUG oslo_concurrency.lockutils [req-328e7f9b-2ef3-4f7a-9631-6cb9ee6c5b01 req-cc57461f-75b0-4135-9620-b5ff7a37451f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.402 187164 DEBUG oslo_concurrency.lockutils [req-328e7f9b-2ef3-4f7a-9631-6cb9ee6c5b01 req-cc57461f-75b0-4135-9620-b5ff7a37451f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.402 187164 DEBUG nova.compute.manager [req-328e7f9b-2ef3-4f7a-9631-6cb9ee6c5b01 req-cc57461f-75b0-4135-9620-b5ff7a37451f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] No waiting events found dispatching network-vif-plugged-0bfa1c75-3860-493b-8ae8-1b6935a9b91a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.402 187164 WARNING nova.compute.manager [req-328e7f9b-2ef3-4f7a-9631-6cb9ee6c5b01 req-cc57461f-75b0-4135-9620-b5ff7a37451f 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Received unexpected event network-vif-plugged-0bfa1c75-3860-493b-8ae8-1b6935a9b91a for instance with vm_state active and task_state deleting.#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.910 187164 DEBUG nova.network.neutron [-] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.945 187164 INFO nova.compute.manager [-] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Took 1.31 seconds to deallocate network for instance.#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.987 187164 DEBUG oslo_concurrency.lockutils [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:05:26 np0005546954 nova_compute[187160]: 2025-12-05 13:05:26.987 187164 DEBUG oslo_concurrency.lockutils [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:05:27 np0005546954 nova_compute[187160]: 2025-12-05 13:05:27.039 187164 DEBUG nova.compute.provider_tree [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:05:27 np0005546954 nova_compute[187160]: 2025-12-05 13:05:27.057 187164 DEBUG nova.scheduler.client.report [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:05:27 np0005546954 nova_compute[187160]: 2025-12-05 13:05:27.092 187164 DEBUG oslo_concurrency.lockutils [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:27 np0005546954 nova_compute[187160]: 2025-12-05 13:05:27.284 187164 INFO nova.scheduler.client.report [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Deleted allocations for instance 86c73a1d-eb82-4ab9-9714-0a0dc3f57225#033[00m
Dec  5 08:05:27 np0005546954 nova_compute[187160]: 2025-12-05 13:05:27.364 187164 DEBUG oslo_concurrency.lockutils [None req-52106e5d-af1e-412f-8732-c209ef3d5359 0ae0bb20ac8b4be99eb1abddc7310436 e6ae0d0dcde04b85b6dae45560cca988 - - default default] Lock "86c73a1d-eb82-4ab9-9714-0a0dc3f57225" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:05:28 np0005546954 nova_compute[187160]: 2025-12-05 13:05:28.522 187164 DEBUG nova.compute.manager [req-78af4703-380c-43e0-9a65-4a9f32ccdb9f req-05eb01ab-6138-4941-81cb-70840c60c5cd 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Received event network-vif-deleted-0bfa1c75-3860-493b-8ae8-1b6935a9b91a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:05:30 np0005546954 nova_compute[187160]: 2025-12-05 13:05:30.583 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:31 np0005546954 nova_compute[187160]: 2025-12-05 13:05:31.115 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:31 np0005546954 podman[217487]: 2025-12-05 13:05:31.557083478 +0000 UTC m=+0.066167179 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 08:05:31 np0005546954 podman[217486]: 2025-12-05 13:05:31.557502061 +0000 UTC m=+0.069133340 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Dec  5 08:05:35 np0005546954 nova_compute[187160]: 2025-12-05 13:05:35.586 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:35 np0005546954 podman[197513]: time="2025-12-05T13:05:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:05:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:05:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:05:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:05:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Dec  5 08:05:36 np0005546954 nova_compute[187160]: 2025-12-05 13:05:36.117 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:37 np0005546954 nova_compute[187160]: 2025-12-05 13:05:37.399 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939922.3975337, 76812e6f-bda5-495b-be99-2ff8c5960729 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:05:37 np0005546954 nova_compute[187160]: 2025-12-05 13:05:37.399 187164 INFO nova.compute.manager [-] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] VM Stopped (Lifecycle Event)#033[00m
Dec  5 08:05:37 np0005546954 nova_compute[187160]: 2025-12-05 13:05:37.422 187164 DEBUG nova.compute.manager [None req-66d922c3-d418-41a1-89a4-0db8e4ecb40f - - - - - -] [instance: 76812e6f-bda5-495b-be99-2ff8c5960729] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:05:40 np0005546954 nova_compute[187160]: 2025-12-05 13:05:40.563 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764939925.5627093, 86c73a1d-eb82-4ab9-9714-0a0dc3f57225 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:05:40 np0005546954 nova_compute[187160]: 2025-12-05 13:05:40.563 187164 INFO nova.compute.manager [-] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] VM Stopped (Lifecycle Event)#033[00m
Dec  5 08:05:40 np0005546954 nova_compute[187160]: 2025-12-05 13:05:40.587 187164 DEBUG nova.compute.manager [None req-b0bce467-3c40-44c6-8c96-e49c10ad84b1 - - - - - -] [instance: 86c73a1d-eb82-4ab9-9714-0a0dc3f57225] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:05:40 np0005546954 nova_compute[187160]: 2025-12-05 13:05:40.590 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:41 np0005546954 nova_compute[187160]: 2025-12-05 13:05:41.119 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:45 np0005546954 podman[217527]: 2025-12-05 13:05:45.540818886 +0000 UTC m=+0.053363592 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 08:05:45 np0005546954 nova_compute[187160]: 2025-12-05 13:05:45.591 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:46 np0005546954 nova_compute[187160]: 2025-12-05 13:05:46.122 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:05:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:05:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:05:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:05:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:05:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:05:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:05:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:05:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:05:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:05:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:05:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:05:50 np0005546954 podman[217548]: 2025-12-05 13:05:50.535040088 +0000 UTC m=+0.048961475 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 08:05:50 np0005546954 podman[217547]: 2025-12-05 13:05:50.560362022 +0000 UTC m=+0.078289404 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 08:05:50 np0005546954 nova_compute[187160]: 2025-12-05 13:05:50.593 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:51 np0005546954 nova_compute[187160]: 2025-12-05 13:05:51.124 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:55 np0005546954 nova_compute[187160]: 2025-12-05 13:05:55.596 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:56 np0005546954 nova_compute[187160]: 2025-12-05 13:05:56.126 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:05:56 np0005546954 ovn_controller[95566]: 2025-12-05T13:05:56Z|00234|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Dec  5 08:06:00 np0005546954 nova_compute[187160]: 2025-12-05 13:06:00.599 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:01 np0005546954 nova_compute[187160]: 2025-12-05 13:06:01.127 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:02 np0005546954 podman[217594]: 2025-12-05 13:06:02.549935461 +0000 UTC m=+0.061317939 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Dec  5 08:06:02 np0005546954 podman[217595]: 2025-12-05 13:06:02.568429653 +0000 UTC m=+0.062584288 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:06:05 np0005546954 nova_compute[187160]: 2025-12-05 13:06:05.113 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:05 np0005546954 nova_compute[187160]: 2025-12-05 13:06:05.602 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:05 np0005546954 podman[197513]: time="2025-12-05T13:06:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:06:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:06:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:06:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:06:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Dec  5 08:06:06 np0005546954 nova_compute[187160]: 2025-12-05 13:06:06.130 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:10 np0005546954 nova_compute[187160]: 2025-12-05 13:06:10.603 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:11 np0005546954 nova_compute[187160]: 2025-12-05 13:06:11.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:06:11 np0005546954 nova_compute[187160]: 2025-12-05 13:06:11.159 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:13 np0005546954 nova_compute[187160]: 2025-12-05 13:06:13.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:06:13 np0005546954 nova_compute[187160]: 2025-12-05 13:06:13.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:06:13 np0005546954 nova_compute[187160]: 2025-12-05 13:06:13.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:06:13 np0005546954 nova_compute[187160]: 2025-12-05 13:06:13.054 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:06:13 np0005546954 nova_compute[187160]: 2025-12-05 13:06:13.054 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:06:15 np0005546954 nova_compute[187160]: 2025-12-05 13:06:15.605 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:16 np0005546954 nova_compute[187160]: 2025-12-05 13:06:16.162 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:16 np0005546954 podman[217636]: 2025-12-05 13:06:16.54937922 +0000 UTC m=+0.058113211 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 08:06:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:16.970 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:06:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:16.971 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:06:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:16.971 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:06:17 np0005546954 nova_compute[187160]: 2025-12-05 13:06:17.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:06:17 np0005546954 nova_compute[187160]: 2025-12-05 13:06:17.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.063 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.063 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.063 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.063 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.233 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.234 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5876MB free_disk=73.3295783996582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.235 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.235 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.313 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.314 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.334 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.347 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.366 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:06:18 np0005546954 nova_compute[187160]: 2025-12-05 13:06:18.366 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:06:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:06:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:06:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:06:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:06:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:06:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:06:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:06:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:06:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:06:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:06:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:06:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:06:20 np0005546954 nova_compute[187160]: 2025-12-05 13:06:20.608 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:21 np0005546954 nova_compute[187160]: 2025-12-05 13:06:21.163 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:21 np0005546954 nova_compute[187160]: 2025-12-05 13:06:21.367 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:06:21 np0005546954 nova_compute[187160]: 2025-12-05 13:06:21.367 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:06:21 np0005546954 podman[217659]: 2025-12-05 13:06:21.563957355 +0000 UTC m=+0.050144674 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 08:06:21 np0005546954 podman[217658]: 2025-12-05 13:06:21.615590203 +0000 UTC m=+0.115755995 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:06:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:24.246 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:06:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:24.247 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 08:06:24 np0005546954 nova_compute[187160]: 2025-12-05 13:06:24.285 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:25 np0005546954 nova_compute[187160]: 2025-12-05 13:06:25.610 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:26 np0005546954 nova_compute[187160]: 2025-12-05 13:06:26.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:06:26 np0005546954 nova_compute[187160]: 2025-12-05 13:06:26.166 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:30 np0005546954 nova_compute[187160]: 2025-12-05 13:06:30.612 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:31 np0005546954 nova_compute[187160]: 2025-12-05 13:06:31.167 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:32 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:32.251 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:06:33 np0005546954 podman[217707]: 2025-12-05 13:06:33.548189373 +0000 UTC m=+0.053491999 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 08:06:33 np0005546954 podman[217706]: 2025-12-05 13:06:33.558188283 +0000 UTC m=+0.067185753 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 08:06:35 np0005546954 nova_compute[187160]: 2025-12-05 13:06:35.614 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:35 np0005546954 podman[197513]: time="2025-12-05T13:06:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:06:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:06:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:06:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:06:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Dec  5 08:06:36 np0005546954 nova_compute[187160]: 2025-12-05 13:06:36.168 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:38 np0005546954 ovn_controller[95566]: 2025-12-05T13:06:38Z|00235|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec  5 08:06:40 np0005546954 nova_compute[187160]: 2025-12-05 13:06:40.654 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:41 np0005546954 nova_compute[187160]: 2025-12-05 13:06:41.171 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.002 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.003 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.112 187164 DEBUG nova.compute.manager [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.301 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.302 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.311 187164 DEBUG nova.virt.hardware [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.312 187164 INFO nova.compute.claims [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Claim successful on node compute-1.ctlplane.example.com
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.428 187164 DEBUG nova.compute.provider_tree [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.452 187164 DEBUG nova.scheduler.client.report [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.476 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.477 187164 DEBUG nova.compute.manager [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.541 187164 DEBUG nova.compute.manager [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.542 187164 DEBUG nova.network.neutron [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.570 187164 INFO nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.589 187164 DEBUG nova.compute.manager [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.684 187164 DEBUG nova.compute.manager [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.685 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.686 187164 INFO nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Creating image(s)
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.687 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "/var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.687 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "/var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.688 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "/var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.703 187164 DEBUG oslo_concurrency.processutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.775 187164 DEBUG oslo_concurrency.processutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.776 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.777 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.791 187164 DEBUG oslo_concurrency.processutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.845 187164 DEBUG oslo_concurrency.processutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.846 187164 DEBUG oslo_concurrency.processutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.877 187164 DEBUG oslo_concurrency.processutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.878 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.879 187164 DEBUG oslo_concurrency.processutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.932 187164 DEBUG oslo_concurrency.processutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.933 187164 DEBUG nova.virt.disk.api [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Checking if we can resize image /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.934 187164 DEBUG oslo_concurrency.processutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 08:06:42 np0005546954 nova_compute[187160]: 2025-12-05 13:06:42.989 187164 DEBUG nova.policy [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2ac3cc621ecc4823aed54d0815090a78', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cd8844cdffbc42fda56f46bb649ff60d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 08:06:43 np0005546954 nova_compute[187160]: 2025-12-05 13:06:43.026 187164 DEBUG oslo_concurrency.processutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 08:06:43 np0005546954 nova_compute[187160]: 2025-12-05 13:06:43.027 187164 DEBUG nova.virt.disk.api [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Cannot resize image /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 08:06:43 np0005546954 nova_compute[187160]: 2025-12-05 13:06:43.027 187164 DEBUG nova.objects.instance [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lazy-loading 'migration_context' on Instance uuid ff3a12a0-9d4f-4501-99cd-dc404a5e80b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 08:06:43 np0005546954 nova_compute[187160]: 2025-12-05 13:06:43.042 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 08:06:43 np0005546954 nova_compute[187160]: 2025-12-05 13:06:43.043 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Ensure instance console log exists: /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 08:06:43 np0005546954 nova_compute[187160]: 2025-12-05 13:06:43.043 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 08:06:43 np0005546954 nova_compute[187160]: 2025-12-05 13:06:43.044 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 08:06:43 np0005546954 nova_compute[187160]: 2025-12-05 13:06:43.044 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 08:06:44 np0005546954 nova_compute[187160]: 2025-12-05 13:06:44.207 187164 DEBUG nova.network.neutron [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Successfully created port: 3e75686e-2bb1-41fe-b540-d4013058afc1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.044 187164 DEBUG nova.network.neutron [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Successfully updated port: 3e75686e-2bb1-41fe-b540-d4013058afc1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.060 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "refresh_cache-ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.061 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquired lock "refresh_cache-ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.061 187164 DEBUG nova.network.neutron [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.136 187164 DEBUG nova.compute.manager [req-732aefb7-efc8-4ccb-9e04-eeb22d845d3e req-1e86e317-f93e-4f72-9afc-7abc548e41cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Received event network-changed-3e75686e-2bb1-41fe-b540-d4013058afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.136 187164 DEBUG nova.compute.manager [req-732aefb7-efc8-4ccb-9e04-eeb22d845d3e req-1e86e317-f93e-4f72-9afc-7abc548e41cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Refreshing instance network info cache due to event network-changed-3e75686e-2bb1-41fe-b540-d4013058afc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.136 187164 DEBUG oslo_concurrency.lockutils [req-732aefb7-efc8-4ccb-9e04-eeb22d845d3e req-1e86e317-f93e-4f72-9afc-7abc548e41cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.212 187164 DEBUG nova.network.neutron [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.656 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.977 187164 DEBUG nova.network.neutron [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Updating instance_info_cache with network_info: [{"id": "3e75686e-2bb1-41fe-b540-d4013058afc1", "address": "fa:16:3e:c3:c2:e7", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e75686e-2b", "ovs_interfaceid": "3e75686e-2bb1-41fe-b540-d4013058afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.994 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Releasing lock "refresh_cache-ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.994 187164 DEBUG nova.compute.manager [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Instance network_info: |[{"id": "3e75686e-2bb1-41fe-b540-d4013058afc1", "address": "fa:16:3e:c3:c2:e7", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e75686e-2b", "ovs_interfaceid": "3e75686e-2bb1-41fe-b540-d4013058afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.995 187164 DEBUG oslo_concurrency.lockutils [req-732aefb7-efc8-4ccb-9e04-eeb22d845d3e req-1e86e317-f93e-4f72-9afc-7abc548e41cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.995 187164 DEBUG nova.network.neutron [req-732aefb7-efc8-4ccb-9e04-eeb22d845d3e req-1e86e317-f93e-4f72-9afc-7abc548e41cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Refreshing network info cache for port 3e75686e-2bb1-41fe-b540-d4013058afc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec  5 08:06:45 np0005546954 nova_compute[187160]: 2025-12-05 13:06:45.997 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Start _get_guest_xml network_info=[{"id": "3e75686e-2bb1-41fe-b540-d4013058afc1", "address": "fa:16:3e:c3:c2:e7", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e75686e-2b", "ovs_interfaceid": "3e75686e-2bb1-41fe-b540-d4013058afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.001 187164 WARNING nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.007 187164 DEBUG nova.virt.libvirt.host [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.007 187164 DEBUG nova.virt.libvirt.host [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.011 187164 DEBUG nova.virt.libvirt.host [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.011 187164 DEBUG nova.virt.libvirt.host [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.012 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.012 187164 DEBUG nova.virt.hardware [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.013 187164 DEBUG nova.virt.hardware [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.013 187164 DEBUG nova.virt.hardware [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.013 187164 DEBUG nova.virt.hardware [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.013 187164 DEBUG nova.virt.hardware [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.014 187164 DEBUG nova.virt.hardware [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.014 187164 DEBUG nova.virt.hardware [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.014 187164 DEBUG nova.virt.hardware [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.014 187164 DEBUG nova.virt.hardware [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.014 187164 DEBUG nova.virt.hardware [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.015 187164 DEBUG nova.virt.hardware [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.018 187164 DEBUG nova.virt.libvirt.vif [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T13:06:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1286551380',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1286551380',id=25,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cd8844cdffbc42fda56f46bb649ff60d',ramdisk_id='',reservation_id='r-jxt8u7pv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1594092090',owner_user_name='tempest-TestExecuteVmWorklo
adBalanceStrategy-1594092090-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:06:42Z,user_data=None,user_id='2ac3cc621ecc4823aed54d0815090a78',uuid=ff3a12a0-9d4f-4501-99cd-dc404a5e80b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e75686e-2bb1-41fe-b540-d4013058afc1", "address": "fa:16:3e:c3:c2:e7", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e75686e-2b", "ovs_interfaceid": "3e75686e-2bb1-41fe-b540-d4013058afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.019 187164 DEBUG nova.network.os_vif_util [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Converting VIF {"id": "3e75686e-2bb1-41fe-b540-d4013058afc1", "address": "fa:16:3e:c3:c2:e7", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e75686e-2b", "ovs_interfaceid": "3e75686e-2bb1-41fe-b540-d4013058afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.019 187164 DEBUG nova.network.os_vif_util [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:c2:e7,bridge_name='br-int',has_traffic_filtering=True,id=3e75686e-2bb1-41fe-b540-d4013058afc1,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e75686e-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.020 187164 DEBUG nova.objects.instance [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lazy-loading 'pci_devices' on Instance uuid ff3a12a0-9d4f-4501-99cd-dc404a5e80b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.138 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] End _get_guest_xml xml=<domain type="kvm">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  <uuid>ff3a12a0-9d4f-4501-99cd-dc404a5e80b7</uuid>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  <name>instance-00000019</name>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1286551380</nova:name>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 13:06:46</nova:creationTime>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:        <nova:user uuid="2ac3cc621ecc4823aed54d0815090a78">tempest-TestExecuteVmWorkloadBalanceStrategy-1594092090-project-member</nova:user>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:        <nova:project uuid="cd8844cdffbc42fda56f46bb649ff60d">tempest-TestExecuteVmWorkloadBalanceStrategy-1594092090</nova:project>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:        <nova:port uuid="3e75686e-2bb1-41fe-b540-d4013058afc1">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <system>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <entry name="serial">ff3a12a0-9d4f-4501-99cd-dc404a5e80b7</entry>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <entry name="uuid">ff3a12a0-9d4f-4501-99cd-dc404a5e80b7</entry>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    </system>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  <os>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  </os>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  <features>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  </features>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  </clock>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  <devices>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    </disk>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk.config"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    </disk>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:c3:c2:e7"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <target dev="tap3e75686e-2b"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    </interface>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/console.log" append="off"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    </serial>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <video>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    </video>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    </rng>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 08:06:46 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 08:06:46 np0005546954 nova_compute[187160]:  </devices>
Dec  5 08:06:46 np0005546954 nova_compute[187160]: </domain>
Dec  5 08:06:46 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.140 187164 DEBUG nova.compute.manager [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Preparing to wait for external event network-vif-plugged-3e75686e-2bb1-41fe-b540-d4013058afc1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.141 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.141 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.141 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.142 187164 DEBUG nova.virt.libvirt.vif [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T13:06:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1286551380',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1286551380',id=25,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cd8844cdffbc42fda56f46bb649ff60d',ramdisk_id='',reservation_id='r-jxt8u7pv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1594092090',owner_user_name='tempest-TestExecu
teVmWorkloadBalanceStrategy-1594092090-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:06:42Z,user_data=None,user_id='2ac3cc621ecc4823aed54d0815090a78',uuid=ff3a12a0-9d4f-4501-99cd-dc404a5e80b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e75686e-2bb1-41fe-b540-d4013058afc1", "address": "fa:16:3e:c3:c2:e7", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e75686e-2b", "ovs_interfaceid": "3e75686e-2bb1-41fe-b540-d4013058afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.143 187164 DEBUG nova.network.os_vif_util [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Converting VIF {"id": "3e75686e-2bb1-41fe-b540-d4013058afc1", "address": "fa:16:3e:c3:c2:e7", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e75686e-2b", "ovs_interfaceid": "3e75686e-2bb1-41fe-b540-d4013058afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.143 187164 DEBUG nova.network.os_vif_util [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:c2:e7,bridge_name='br-int',has_traffic_filtering=True,id=3e75686e-2bb1-41fe-b540-d4013058afc1,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e75686e-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.144 187164 DEBUG os_vif [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:c2:e7,bridge_name='br-int',has_traffic_filtering=True,id=3e75686e-2bb1-41fe-b540-d4013058afc1,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e75686e-2b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.145 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.145 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.146 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.149 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.150 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e75686e-2b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.150 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e75686e-2b, col_values=(('external_ids', {'iface-id': '3e75686e-2bb1-41fe-b540-d4013058afc1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:c2:e7', 'vm-uuid': 'ff3a12a0-9d4f-4501-99cd-dc404a5e80b7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.152 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:46 np0005546954 NetworkManager[55665]: <info>  [1764940006.1537] manager: (tap3e75686e-2b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.154 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.161 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.162 187164 INFO os_vif [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:c2:e7,bridge_name='br-int',has_traffic_filtering=True,id=3e75686e-2bb1-41fe-b540-d4013058afc1,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e75686e-2b')#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.173 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.268 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.269 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.270 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] No VIF found with MAC fa:16:3e:c3:c2:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.270 187164 INFO nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Using config drive#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.597 187164 INFO nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Creating config drive at /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk.config#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.602 187164 DEBUG oslo_concurrency.processutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp4uromv1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.724 187164 DEBUG oslo_concurrency.processutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp4uromv1" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:06:46 np0005546954 kernel: tap3e75686e-2b: entered promiscuous mode
Dec  5 08:06:46 np0005546954 NetworkManager[55665]: <info>  [1764940006.8298] manager: (tap3e75686e-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Dec  5 08:06:46 np0005546954 ovn_controller[95566]: 2025-12-05T13:06:46Z|00236|binding|INFO|Claiming lport 3e75686e-2bb1-41fe-b540-d4013058afc1 for this chassis.
Dec  5 08:06:46 np0005546954 ovn_controller[95566]: 2025-12-05T13:06:46Z|00237|binding|INFO|3e75686e-2bb1-41fe-b540-d4013058afc1: Claiming fa:16:3e:c3:c2:e7 10.100.0.7
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.834 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.839 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.855 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:c2:e7 10.100.0.7'], port_security=['fa:16:3e:c3:c2:e7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ff3a12a0-9d4f-4501-99cd-dc404a5e80b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f58a1180-800c-4113-a42d-53ba7a83ff14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd8844cdffbc42fda56f46bb649ff60d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '58c01b8f-2f37-4998-87d2-d107cf040b9e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=466363b7-72e5-437f-abf8-f548593b34f9, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=3e75686e-2bb1-41fe-b540-d4013058afc1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.858 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 3e75686e-2bb1-41fe-b540-d4013058afc1 in datapath f58a1180-800c-4113-a42d-53ba7a83ff14 bound to our chassis#033[00m
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.860 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f58a1180-800c-4113-a42d-53ba7a83ff14#033[00m
Dec  5 08:06:46 np0005546954 systemd-udevd[217787]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.871 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[76ad04c0-f45f-4e9d-b701-c52d6d3a3a64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.875 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf58a1180-81 in ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 08:06:46 np0005546954 NetworkManager[55665]: <info>  [1764940006.8797] device (tap3e75686e-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 08:06:46 np0005546954 NetworkManager[55665]: <info>  [1764940006.8819] device (tap3e75686e-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.883 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf58a1180-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.883 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5c508d75-4e60-40db-a96b-c7e0878142f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.886 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d03c6706-52f4-47cc-9ca6-59c13ce370f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:46 np0005546954 systemd-machined[153497]: New machine qemu-23-instance-00000019.
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.898 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[d684ded9-cfa7-4b84-b290-1705f455e21b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.906 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:46 np0005546954 systemd[1]: Started Virtual Machine qemu-23-instance-00000019.
Dec  5 08:06:46 np0005546954 ovn_controller[95566]: 2025-12-05T13:06:46Z|00238|binding|INFO|Setting lport 3e75686e-2bb1-41fe-b540-d4013058afc1 ovn-installed in OVS
Dec  5 08:06:46 np0005546954 ovn_controller[95566]: 2025-12-05T13:06:46Z|00239|binding|INFO|Setting lport 3e75686e-2bb1-41fe-b540-d4013058afc1 up in Southbound
Dec  5 08:06:46 np0005546954 nova_compute[187160]: 2025-12-05 13:06:46.916 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.923 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[4137dd63-4363-456d-826e-ddb02aafec11]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:46 np0005546954 podman[217772]: 2025-12-05 13:06:46.930305214 +0000 UTC m=+0.106715216 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.954 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[deae555b-f7a2-414d-af55-0395bbd9174a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.959 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ce00d33d-d719-4c40-9279-050004158aa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:46 np0005546954 NetworkManager[55665]: <info>  [1764940006.9623] manager: (tapf58a1180-80): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.993 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[92015f11-4fcb-4fda-8445-bc67327bfbfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:46 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:46.996 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[756811fc-5b77-4716-be6f-c5f59d9e7555]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:47 np0005546954 NetworkManager[55665]: <info>  [1764940007.0153] device (tapf58a1180-80): carrier: link connected
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.022 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[14d75794-95e7-4bf5-b8f2-302ed2c1597d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.039 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7cff80af-d7ab-488c-8cdb-87befdb3e641]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf58a1180-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:5d:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501364, 'reachable_time': 41115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217831, 'error': None, 'target': 'ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.055 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d2de660e-f6ff-4cb0-bba9-25194f38f94f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe49:5d63'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501364, 'tstamp': 501364}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217832, 'error': None, 'target': 'ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.079 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7a93accd-ba3c-4763-aca1-0caecf841ca5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf58a1180-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:5d:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501364, 'reachable_time': 41115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217833, 'error': None, 'target': 'ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.117 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[1a949659-1f3a-4347-91e0-9e59a6e2533b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.196 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ee6c7682-130a-4fd2-85c0-9386741e520f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.198 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf58a1180-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.198 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.198 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf58a1180-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.200 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:47 np0005546954 NetworkManager[55665]: <info>  [1764940007.2013] manager: (tapf58a1180-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Dec  5 08:06:47 np0005546954 kernel: tapf58a1180-80: entered promiscuous mode
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.203 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf58a1180-80, col_values=(('external_ids', {'iface-id': 'e6453cd1-5272-499c-b5ee-eab27e5bfe29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:06:47 np0005546954 ovn_controller[95566]: 2025-12-05T13:06:47Z|00240|binding|INFO|Releasing lport e6453cd1-5272-499c-b5ee-eab27e5bfe29 from this chassis (sb_readonly=0)
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.215 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.216 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f58a1180-800c-4113-a42d-53ba7a83ff14.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f58a1180-800c-4113-a42d-53ba7a83ff14.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.217 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[629df5f9-6938-4dc1-979a-38bd269c1ef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.217 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-f58a1180-800c-4113-a42d-53ba7a83ff14
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/f58a1180-800c-4113-a42d-53ba7a83ff14.pid.haproxy
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID f58a1180-800c-4113-a42d-53ba7a83ff14
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 08:06:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:06:47.218 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14', 'env', 'PROCESS_TAG=haproxy-f58a1180-800c-4113-a42d-53ba7a83ff14', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f58a1180-800c-4113-a42d-53ba7a83ff14.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.229 187164 DEBUG nova.compute.manager [req-c4aa782b-c889-4327-abbc-b381a33373b4 req-437149c6-f019-47cd-8b22-47ea3729d8e4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Received event network-vif-plugged-3e75686e-2bb1-41fe-b540-d4013058afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.230 187164 DEBUG oslo_concurrency.lockutils [req-c4aa782b-c889-4327-abbc-b381a33373b4 req-437149c6-f019-47cd-8b22-47ea3729d8e4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.230 187164 DEBUG oslo_concurrency.lockutils [req-c4aa782b-c889-4327-abbc-b381a33373b4 req-437149c6-f019-47cd-8b22-47ea3729d8e4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.230 187164 DEBUG oslo_concurrency.lockutils [req-c4aa782b-c889-4327-abbc-b381a33373b4 req-437149c6-f019-47cd-8b22-47ea3729d8e4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.231 187164 DEBUG nova.compute.manager [req-c4aa782b-c889-4327-abbc-b381a33373b4 req-437149c6-f019-47cd-8b22-47ea3729d8e4 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Processing event network-vif-plugged-3e75686e-2bb1-41fe-b540-d4013058afc1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.341 187164 DEBUG nova.network.neutron [req-732aefb7-efc8-4ccb-9e04-eeb22d845d3e req-1e86e317-f93e-4f72-9afc-7abc548e41cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Updated VIF entry in instance network info cache for port 3e75686e-2bb1-41fe-b540-d4013058afc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.342 187164 DEBUG nova.network.neutron [req-732aefb7-efc8-4ccb-9e04-eeb22d845d3e req-1e86e317-f93e-4f72-9afc-7abc548e41cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Updating instance_info_cache with network_info: [{"id": "3e75686e-2bb1-41fe-b540-d4013058afc1", "address": "fa:16:3e:c3:c2:e7", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e75686e-2b", "ovs_interfaceid": "3e75686e-2bb1-41fe-b540-d4013058afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.356 187164 DEBUG oslo_concurrency.lockutils [req-732aefb7-efc8-4ccb-9e04-eeb22d845d3e req-1e86e317-f93e-4f72-9afc-7abc548e41cc 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:06:47 np0005546954 podman[217862]: 2025-12-05 13:06:47.597640352 +0000 UTC m=+0.049873105 container create 948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:06:47 np0005546954 systemd[1]: Started libpod-conmon-948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97.scope.
Dec  5 08:06:47 np0005546954 podman[217862]: 2025-12-05 13:06:47.568435637 +0000 UTC m=+0.020668430 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 08:06:47 np0005546954 systemd[1]: Started libcrun container.
Dec  5 08:06:47 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b959b82288767c4c2a56741ee32fc66dc9d132e431e590c5fe8e4a6a0a1f3bad/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 08:06:47 np0005546954 podman[217862]: 2025-12-05 13:06:47.931311347 +0000 UTC m=+0.383544110 container init 948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Dec  5 08:06:47 np0005546954 podman[217862]: 2025-12-05 13:06:47.938253701 +0000 UTC m=+0.390486444 container start 948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.948 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764940007.948089, ff3a12a0-9d4f-4501-99cd-dc404a5e80b7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.948 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] VM Started (Lifecycle Event)#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.950 187164 DEBUG nova.compute.manager [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.953 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.956 187164 INFO nova.virt.libvirt.driver [-] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Instance spawned successfully.#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.956 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 08:06:47 np0005546954 neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14[217877]: [NOTICE]   (217888) : New worker (217890) forked
Dec  5 08:06:47 np0005546954 neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14[217877]: [NOTICE]   (217888) : Loading success.
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.980 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.987 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.993 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.994 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.994 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.995 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.996 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:06:47 np0005546954 nova_compute[187160]: 2025-12-05 13:06:47.996 187164 DEBUG nova.virt.libvirt.driver [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.020 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.021 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764940007.9482174, ff3a12a0-9d4f-4501-99cd-dc404a5e80b7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.021 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] VM Paused (Lifecycle Event)#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.047 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.051 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764940007.9527235, ff3a12a0-9d4f-4501-99cd-dc404a5e80b7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.051 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] VM Resumed (Lifecycle Event)#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.059 187164 INFO nova.compute.manager [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Took 5.37 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.060 187164 DEBUG nova.compute.manager [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.071 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.074 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.116 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.139 187164 INFO nova.compute.manager [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Took 5.87 seconds to build instance.#033[00m
Dec  5 08:06:48 np0005546954 nova_compute[187160]: 2025-12-05 13:06:48.156 187164 DEBUG oslo_concurrency.lockutils [None req-432e71c0-82a9-4da5-a08b-2fdebe27115e 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:06:49 np0005546954 nova_compute[187160]: 2025-12-05 13:06:49.329 187164 DEBUG nova.compute.manager [req-17561c6d-b152-45c7-aff3-484f0b1d7052 req-bd5b1f31-a8f2-4af0-adb4-77aa67b1ebc6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Received event network-vif-plugged-3e75686e-2bb1-41fe-b540-d4013058afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:06:49 np0005546954 nova_compute[187160]: 2025-12-05 13:06:49.330 187164 DEBUG oslo_concurrency.lockutils [req-17561c6d-b152-45c7-aff3-484f0b1d7052 req-bd5b1f31-a8f2-4af0-adb4-77aa67b1ebc6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:06:49 np0005546954 nova_compute[187160]: 2025-12-05 13:06:49.330 187164 DEBUG oslo_concurrency.lockutils [req-17561c6d-b152-45c7-aff3-484f0b1d7052 req-bd5b1f31-a8f2-4af0-adb4-77aa67b1ebc6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:06:49 np0005546954 nova_compute[187160]: 2025-12-05 13:06:49.331 187164 DEBUG oslo_concurrency.lockutils [req-17561c6d-b152-45c7-aff3-484f0b1d7052 req-bd5b1f31-a8f2-4af0-adb4-77aa67b1ebc6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:06:49 np0005546954 nova_compute[187160]: 2025-12-05 13:06:49.331 187164 DEBUG nova.compute.manager [req-17561c6d-b152-45c7-aff3-484f0b1d7052 req-bd5b1f31-a8f2-4af0-adb4-77aa67b1ebc6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] No waiting events found dispatching network-vif-plugged-3e75686e-2bb1-41fe-b540-d4013058afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:06:49 np0005546954 nova_compute[187160]: 2025-12-05 13:06:49.332 187164 WARNING nova.compute.manager [req-17561c6d-b152-45c7-aff3-484f0b1d7052 req-bd5b1f31-a8f2-4af0-adb4-77aa67b1ebc6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Received unexpected event network-vif-plugged-3e75686e-2bb1-41fe-b540-d4013058afc1 for instance with vm_state active and task_state None.#033[00m
Dec  5 08:06:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:06:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:06:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:06:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:06:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:06:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:06:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:06:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:06:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:06:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:06:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:06:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:06:51 np0005546954 nova_compute[187160]: 2025-12-05 13:06:51.155 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:51 np0005546954 nova_compute[187160]: 2025-12-05 13:06:51.176 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:52 np0005546954 podman[217900]: 2025-12-05 13:06:52.569277627 +0000 UTC m=+0.070652400 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 08:06:52 np0005546954 podman[217899]: 2025-12-05 13:06:52.5732854 +0000 UTC m=+0.080016758 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:06:56 np0005546954 nova_compute[187160]: 2025-12-05 13:06:56.158 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:06:56 np0005546954 nova_compute[187160]: 2025-12-05 13:06:56.178 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:00 np0005546954 ovn_controller[95566]: 2025-12-05T13:07:00Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c3:c2:e7 10.100.0.7
Dec  5 08:07:00 np0005546954 ovn_controller[95566]: 2025-12-05T13:07:00Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c3:c2:e7 10.100.0.7
Dec  5 08:07:01 np0005546954 nova_compute[187160]: 2025-12-05 13:07:01.162 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:01 np0005546954 nova_compute[187160]: 2025-12-05 13:07:01.180 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:04 np0005546954 podman[217968]: 2025-12-05 13:07:04.552928086 +0000 UTC m=+0.064499878 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Dec  5 08:07:04 np0005546954 podman[217969]: 2025-12-05 13:07:04.552440652 +0000 UTC m=+0.060418433 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:07:05 np0005546954 podman[197513]: time="2025-12-05T13:07:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:07:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:07:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:07:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:07:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3063 "" "Go-http-client/1.1"
Dec  5 08:07:06 np0005546954 nova_compute[187160]: 2025-12-05 13:07:06.182 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:07:06 np0005546954 nova_compute[187160]: 2025-12-05 13:07:06.184 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:07:06 np0005546954 nova_compute[187160]: 2025-12-05 13:07:06.184 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 08:07:06 np0005546954 nova_compute[187160]: 2025-12-05 13:07:06.184 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 08:07:06 np0005546954 nova_compute[187160]: 2025-12-05 13:07:06.215 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:06 np0005546954 nova_compute[187160]: 2025-12-05 13:07:06.216 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 08:07:11 np0005546954 nova_compute[187160]: 2025-12-05 13:07:11.217 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:11 np0005546954 nova_compute[187160]: 2025-12-05 13:07:11.218 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:11 np0005546954 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  5 08:07:13 np0005546954 nova_compute[187160]: 2025-12-05 13:07:13.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:07:13 np0005546954 nova_compute[187160]: 2025-12-05 13:07:13.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:07:13 np0005546954 nova_compute[187160]: 2025-12-05 13:07:13.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:07:13 np0005546954 nova_compute[187160]: 2025-12-05 13:07:13.953 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:07:13 np0005546954 nova_compute[187160]: 2025-12-05 13:07:13.954 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:07:13 np0005546954 nova_compute[187160]: 2025-12-05 13:07:13.954 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 08:07:13 np0005546954 nova_compute[187160]: 2025-12-05 13:07:13.954 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ff3a12a0-9d4f-4501-99cd-dc404a5e80b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:07:15 np0005546954 nova_compute[187160]: 2025-12-05 13:07:15.367 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Updating instance_info_cache with network_info: [{"id": "3e75686e-2bb1-41fe-b540-d4013058afc1", "address": "fa:16:3e:c3:c2:e7", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e75686e-2b", "ovs_interfaceid": "3e75686e-2bb1-41fe-b540-d4013058afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:07:15 np0005546954 nova_compute[187160]: 2025-12-05 13:07:15.388 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:07:15 np0005546954 nova_compute[187160]: 2025-12-05 13:07:15.388 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 08:07:15 np0005546954 nova_compute[187160]: 2025-12-05 13:07:15.388 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:07:15 np0005546954 nova_compute[187160]: 2025-12-05 13:07:15.389 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:07:16 np0005546954 nova_compute[187160]: 2025-12-05 13:07:16.218 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:16 np0005546954 nova_compute[187160]: 2025-12-05 13:07:16.219 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:07:16.971 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:07:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:07:16.972 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:07:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:07:16.972 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:07:17 np0005546954 nova_compute[187160]: 2025-12-05 13:07:17.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:07:17 np0005546954 podman[218014]: 2025-12-05 13:07:17.573442588 +0000 UTC m=+0.082830025 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  5 08:07:18 np0005546954 nova_compute[187160]: 2025-12-05 13:07:18.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:07:19 np0005546954 nova_compute[187160]: 2025-12-05 13:07:19.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:07:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:07:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:07:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:07:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:07:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:07:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:07:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:07:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:07:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:07:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:07:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:07:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.041 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.063 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.064 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.064 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.064 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.135 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.210 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.210 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.271 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.405 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.406 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5666MB free_disk=73.3002700805664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.406 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.407 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.490 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance ff3a12a0-9d4f-4501-99cd-dc404a5e80b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.490 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.491 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.529 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.545 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.567 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:07:20 np0005546954 nova_compute[187160]: 2025-12-05 13:07:20.567 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:07:21 np0005546954 nova_compute[187160]: 2025-12-05 13:07:21.221 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:23 np0005546954 nova_compute[187160]: 2025-12-05 13:07:23.566 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:07:23 np0005546954 nova_compute[187160]: 2025-12-05 13:07:23.567 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:07:23 np0005546954 podman[218041]: 2025-12-05 13:07:23.583760633 +0000 UTC m=+0.078352018 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:07:23 np0005546954 podman[218040]: 2025-12-05 13:07:23.600067278 +0000 UTC m=+0.112742183 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 08:07:25 np0005546954 nova_compute[187160]: 2025-12-05 13:07:25.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:07:26 np0005546954 nova_compute[187160]: 2025-12-05 13:07:26.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:07:26 np0005546954 nova_compute[187160]: 2025-12-05 13:07:26.222 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:29 np0005546954 ovn_controller[95566]: 2025-12-05T13:07:29Z|00241|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Dec  5 08:07:31 np0005546954 nova_compute[187160]: 2025-12-05 13:07:31.224 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:35 np0005546954 podman[218089]: 2025-12-05 13:07:35.554081842 +0000 UTC m=+0.057666958 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125)
Dec  5 08:07:35 np0005546954 podman[218088]: 2025-12-05 13:07:35.566162116 +0000 UTC m=+0.064833029 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Dec  5 08:07:35 np0005546954 podman[197513]: time="2025-12-05T13:07:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:07:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:07:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:07:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:07:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3056 "" "Go-http-client/1.1"
Dec  5 08:07:36 np0005546954 nova_compute[187160]: 2025-12-05 13:07:36.226 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:41 np0005546954 nova_compute[187160]: 2025-12-05 13:07:41.228 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:46 np0005546954 nova_compute[187160]: 2025-12-05 13:07:46.231 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:07:48 np0005546954 podman[218137]: 2025-12-05 13:07:48.59636868 +0000 UTC m=+0.099694649 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 08:07:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:07:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:07:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:07:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:07:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:07:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:07:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:07:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:07:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:07:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:07:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:07:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:07:51 np0005546954 nova_compute[187160]: 2025-12-05 13:07:51.232 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:07:51 np0005546954 nova_compute[187160]: 2025-12-05 13:07:51.233 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:51 np0005546954 nova_compute[187160]: 2025-12-05 13:07:51.234 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 08:07:51 np0005546954 nova_compute[187160]: 2025-12-05 13:07:51.234 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 08:07:51 np0005546954 nova_compute[187160]: 2025-12-05 13:07:51.234 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 08:07:51 np0005546954 nova_compute[187160]: 2025-12-05 13:07:51.235 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:07:54 np0005546954 podman[218158]: 2025-12-05 13:07:54.558814999 +0000 UTC m=+0.059320407 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:07:54 np0005546954 podman[218157]: 2025-12-05 13:07:54.600860381 +0000 UTC m=+0.117837780 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 08:07:56 np0005546954 nova_compute[187160]: 2025-12-05 13:07:56.234 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:01 np0005546954 nova_compute[187160]: 2025-12-05 13:08:01.237 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:05 np0005546954 podman[197513]: time="2025-12-05T13:08:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:08:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:08:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:08:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:08:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3057 "" "Go-http-client/1.1"
Dec  5 08:08:06 np0005546954 nova_compute[187160]: 2025-12-05 13:08:06.238 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:06 np0005546954 nova_compute[187160]: 2025-12-05 13:08:06.240 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:06 np0005546954 podman[218208]: 2025-12-05 13:08:06.546122714 +0000 UTC m=+0.060072462 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  5 08:08:06 np0005546954 podman[218207]: 2025-12-05 13:08:06.580133246 +0000 UTC m=+0.095933471 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal)
Dec  5 08:08:11 np0005546954 nova_compute[187160]: 2025-12-05 13:08:11.240 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:08:11 np0005546954 nova_compute[187160]: 2025-12-05 13:08:11.243 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:08:11 np0005546954 nova_compute[187160]: 2025-12-05 13:08:11.243 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 08:08:11 np0005546954 nova_compute[187160]: 2025-12-05 13:08:11.244 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 08:08:11 np0005546954 nova_compute[187160]: 2025-12-05 13:08:11.297 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:11 np0005546954 nova_compute[187160]: 2025-12-05 13:08:11.298 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 08:08:13 np0005546954 nova_compute[187160]: 2025-12-05 13:08:13.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:08:13 np0005546954 nova_compute[187160]: 2025-12-05 13:08:13.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:08:13 np0005546954 nova_compute[187160]: 2025-12-05 13:08:13.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:08:13 np0005546954 nova_compute[187160]: 2025-12-05 13:08:13.425 187164 DEBUG nova.virt.libvirt.driver [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Creating tmpfile /var/lib/nova/instances/tmpnexggvje to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  5 08:08:13 np0005546954 nova_compute[187160]: 2025-12-05 13:08:13.426 187164 DEBUG nova.compute.manager [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnexggvje',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  5 08:08:13 np0005546954 nova_compute[187160]: 2025-12-05 13:08:13.502 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:08:13 np0005546954 nova_compute[187160]: 2025-12-05 13:08:13.502 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:08:13 np0005546954 nova_compute[187160]: 2025-12-05 13:08:13.502 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 08:08:13 np0005546954 nova_compute[187160]: 2025-12-05 13:08:13.503 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ff3a12a0-9d4f-4501-99cd-dc404a5e80b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:08:14 np0005546954 nova_compute[187160]: 2025-12-05 13:08:14.662 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Updating instance_info_cache with network_info: [{"id": "3e75686e-2bb1-41fe-b540-d4013058afc1", "address": "fa:16:3e:c3:c2:e7", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e75686e-2b", "ovs_interfaceid": "3e75686e-2bb1-41fe-b540-d4013058afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:08:14 np0005546954 nova_compute[187160]: 2025-12-05 13:08:14.952 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:08:14 np0005546954 nova_compute[187160]: 2025-12-05 13:08:14.952 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 08:08:14 np0005546954 nova_compute[187160]: 2025-12-05 13:08:14.953 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:08:14 np0005546954 nova_compute[187160]: 2025-12-05 13:08:14.958 187164 DEBUG nova.compute.manager [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnexggvje',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bdbf5388-d51b-4523-9277-e4b5ce90ed5a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  5 08:08:15 np0005546954 nova_compute[187160]: 2025-12-05 13:08:15.022 187164 DEBUG oslo_concurrency.lockutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-bdbf5388-d51b-4523-9277-e4b5ce90ed5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:08:15 np0005546954 nova_compute[187160]: 2025-12-05 13:08:15.023 187164 DEBUG oslo_concurrency.lockutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-bdbf5388-d51b-4523-9277-e4b5ce90ed5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:08:15 np0005546954 nova_compute[187160]: 2025-12-05 13:08:15.024 187164 DEBUG nova.network.neutron [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 08:08:16 np0005546954 nova_compute[187160]: 2025-12-05 13:08:16.299 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:08:16 np0005546954 nova_compute[187160]: 2025-12-05 13:08:16.300 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:16 np0005546954 nova_compute[187160]: 2025-12-05 13:08:16.300 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 08:08:16 np0005546954 nova_compute[187160]: 2025-12-05 13:08:16.300 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 08:08:16 np0005546954 nova_compute[187160]: 2025-12-05 13:08:16.301 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 08:08:16 np0005546954 nova_compute[187160]: 2025-12-05 13:08:16.303 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:08:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:16.973 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:16.974 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:16.975 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:17 np0005546954 nova_compute[187160]: 2025-12-05 13:08:17.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:08:17 np0005546954 nova_compute[187160]: 2025-12-05 13:08:17.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.236 187164 DEBUG nova.network.neutron [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Updating instance_info_cache with network_info: [{"id": "2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe", "address": "fa:16:3e:e5:5a:7e", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e70bf4b-d3", "ovs_interfaceid": "2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.316 187164 DEBUG oslo_concurrency.lockutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-bdbf5388-d51b-4523-9277-e4b5ce90ed5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.318 187164 DEBUG nova.virt.libvirt.driver [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnexggvje',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bdbf5388-d51b-4523-9277-e4b5ce90ed5a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.318 187164 DEBUG nova.virt.libvirt.driver [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Creating instance directory: /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.319 187164 DEBUG nova.virt.libvirt.driver [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Creating disk.info with the contents: {'/var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk': 'qcow2', '/var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.319 187164 DEBUG nova.virt.libvirt.driver [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.320 187164 DEBUG nova.objects.instance [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'trusted_certs' on Instance uuid bdbf5388-d51b-4523-9277-e4b5ce90ed5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.350 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.442 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.444 187164 DEBUG oslo_concurrency.lockutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.445 187164 DEBUG oslo_concurrency.lockutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.469 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.524 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.526 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.578 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.579 187164 DEBUG oslo_concurrency.lockutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.580 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.638 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.639 187164 DEBUG nova.virt.disk.api [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Checking if we can resize image /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.640 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.728 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.730 187164 DEBUG nova.virt.disk.api [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Cannot resize image /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.730 187164 DEBUG nova.objects.instance [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'migration_context' on Instance uuid bdbf5388-d51b-4523-9277-e4b5ce90ed5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.780 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.812 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk.config 485376" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.817 187164 DEBUG nova.virt.libvirt.volume.remotefs [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk.config to /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  5 08:08:18 np0005546954 nova_compute[187160]: 2025-12-05 13:08:18.818 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk.config /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.260 187164 DEBUG oslo_concurrency.processutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a/disk.config /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.261 187164 DEBUG nova.virt.libvirt.driver [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.262 187164 DEBUG nova.virt.libvirt.vif [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T13:06:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1515208707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1515208707',id=26,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:07:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cd8844cdffbc42fda56f46bb649ff60d',ramdisk_id='',reservation_id='r-sv42ywpi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1594092090',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1594092090-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:07:06Z,user_data=None,user_id='2ac3cc621ecc4823aed54d0815090a78',uuid=bdbf5388-d51b-4523-9277-e4b5ce90ed5a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe", "address": "fa:16:3e:e5:5a:7e", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2e70bf4b-d3", "ovs_interfaceid": "2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.263 187164 DEBUG nova.network.os_vif_util [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe", "address": "fa:16:3e:e5:5a:7e", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2e70bf4b-d3", "ovs_interfaceid": "2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.264 187164 DEBUG nova.network.os_vif_util [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e5:5a:7e,bridge_name='br-int',has_traffic_filtering=True,id=2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e70bf4b-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.265 187164 DEBUG os_vif [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:5a:7e,bridge_name='br-int',has_traffic_filtering=True,id=2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e70bf4b-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.266 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.266 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.267 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.270 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.271 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e70bf4b-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.271 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2e70bf4b-d3, col_values=(('external_ids', {'iface-id': '2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:5a:7e', 'vm-uuid': 'bdbf5388-d51b-4523-9277-e4b5ce90ed5a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.274 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:19 np0005546954 NetworkManager[55665]: <info>  [1764940099.2756] manager: (tap2e70bf4b-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.277 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.282 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.283 187164 INFO os_vif [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:5a:7e,bridge_name='br-int',has_traffic_filtering=True,id=2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e70bf4b-d3')#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.284 187164 DEBUG nova.virt.libvirt.driver [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  5 08:08:19 np0005546954 nova_compute[187160]: 2025-12-05 13:08:19.284 187164 DEBUG nova.compute.manager [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnexggvje',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bdbf5388-d51b-4523-9277-e4b5ce90ed5a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  5 08:08:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:08:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:08:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:08:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:08:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:08:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:08:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:08:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:08:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:08:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:08:19 np0005546954 podman[218272]: 2025-12-05 13:08:19.543930131 +0000 UTC m=+0.057011037 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.233 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.234 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.234 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.235 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.427 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.518 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.519 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.590 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.796 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.797 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5702MB free_disk=73.29960632324219GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.798 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.798 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.848 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Migration for instance bdbf5388-d51b-4523-9277-e4b5ce90ed5a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.866 187164 INFO nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Updating resource usage from migration e68d700f-bd47-4545-b32e-5e978fd5faac#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.867 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Starting to track incoming migration e68d700f-bd47-4545-b32e-5e978fd5faac with flavor b4ea63be-97f8-4a48-b000-66321c4ddb27 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.907 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance ff3a12a0-9d4f-4501-99cd-dc404a5e80b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.937 187164 WARNING nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance bdbf5388-d51b-4523-9277-e4b5ce90ed5a has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.937 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.938 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:08:20 np0005546954 nova_compute[187160]: 2025-12-05 13:08:20.988 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:08:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:21.204 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:08:21 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:21.205 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 08:08:21 np0005546954 nova_compute[187160]: 2025-12-05 13:08:21.220 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:08:21 np0005546954 nova_compute[187160]: 2025-12-05 13:08:21.241 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:21 np0005546954 nova_compute[187160]: 2025-12-05 13:08:21.254 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:08:21 np0005546954 nova_compute[187160]: 2025-12-05 13:08:21.255 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:21 np0005546954 nova_compute[187160]: 2025-12-05 13:08:21.302 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:22 np0005546954 nova_compute[187160]: 2025-12-05 13:08:22.020 187164 DEBUG nova.network.neutron [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Port 2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  5 08:08:22 np0005546954 nova_compute[187160]: 2025-12-05 13:08:22.022 187164 DEBUG nova.compute.manager [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnexggvje',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bdbf5388-d51b-4523-9277-e4b5ce90ed5a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  5 08:08:22 np0005546954 systemd[1]: Starting libvirt proxy daemon...
Dec  5 08:08:22 np0005546954 systemd[1]: Started libvirt proxy daemon.
Dec  5 08:08:22 np0005546954 kernel: tap2e70bf4b-d3: entered promiscuous mode
Dec  5 08:08:22 np0005546954 NetworkManager[55665]: <info>  [1764940102.2896] manager: (tap2e70bf4b-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Dec  5 08:08:22 np0005546954 ovn_controller[95566]: 2025-12-05T13:08:22Z|00242|binding|INFO|Claiming lport 2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe for this additional chassis.
Dec  5 08:08:22 np0005546954 ovn_controller[95566]: 2025-12-05T13:08:22Z|00243|binding|INFO|2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe: Claiming fa:16:3e:e5:5a:7e 10.100.0.14
Dec  5 08:08:22 np0005546954 nova_compute[187160]: 2025-12-05 13:08:22.336 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:22 np0005546954 ovn_controller[95566]: 2025-12-05T13:08:22Z|00244|binding|INFO|Setting lport 2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe ovn-installed in OVS
Dec  5 08:08:22 np0005546954 nova_compute[187160]: 2025-12-05 13:08:22.348 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:22 np0005546954 nova_compute[187160]: 2025-12-05 13:08:22.350 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:22 np0005546954 systemd-udevd[218331]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:08:22 np0005546954 systemd-machined[153497]: New machine qemu-24-instance-0000001a.
Dec  5 08:08:22 np0005546954 NetworkManager[55665]: <info>  [1764940102.3691] device (tap2e70bf4b-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 08:08:22 np0005546954 NetworkManager[55665]: <info>  [1764940102.3700] device (tap2e70bf4b-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 08:08:22 np0005546954 systemd[1]: Started Virtual Machine qemu-24-instance-0000001a.
Dec  5 08:08:22 np0005546954 nova_compute[187160]: 2025-12-05 13:08:22.820 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764940102.8200214, bdbf5388-d51b-4523-9277-e4b5ce90ed5a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:08:22 np0005546954 nova_compute[187160]: 2025-12-05 13:08:22.821 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] VM Started (Lifecycle Event)#033[00m
Dec  5 08:08:22 np0005546954 nova_compute[187160]: 2025-12-05 13:08:22.851 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:08:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:23.208 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:23 np0005546954 nova_compute[187160]: 2025-12-05 13:08:23.910 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764940103.9098454, bdbf5388-d51b-4523-9277-e4b5ce90ed5a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:08:23 np0005546954 nova_compute[187160]: 2025-12-05 13:08:23.910 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] VM Resumed (Lifecycle Event)#033[00m
Dec  5 08:08:24 np0005546954 nova_compute[187160]: 2025-12-05 13:08:23.999 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:08:24 np0005546954 nova_compute[187160]: 2025-12-05 13:08:24.005 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:08:24 np0005546954 nova_compute[187160]: 2025-12-05 13:08:24.023 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  5 08:08:24 np0005546954 nova_compute[187160]: 2025-12-05 13:08:24.275 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:25 np0005546954 nova_compute[187160]: 2025-12-05 13:08:25.256 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:08:25 np0005546954 nova_compute[187160]: 2025-12-05 13:08:25.257 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:08:25 np0005546954 ovn_controller[95566]: 2025-12-05T13:08:25Z|00245|binding|INFO|Claiming lport 2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe for this chassis.
Dec  5 08:08:25 np0005546954 ovn_controller[95566]: 2025-12-05T13:08:25Z|00246|binding|INFO|2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe: Claiming fa:16:3e:e5:5a:7e 10.100.0.14
Dec  5 08:08:25 np0005546954 ovn_controller[95566]: 2025-12-05T13:08:25Z|00247|binding|INFO|Setting lport 2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe up in Southbound
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.484 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:5a:7e 10.100.0.14'], port_security=['fa:16:3e:e5:5a:7e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'bdbf5388-d51b-4523-9277-e4b5ce90ed5a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f58a1180-800c-4113-a42d-53ba7a83ff14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd8844cdffbc42fda56f46bb649ff60d', 'neutron:revision_number': '11', 'neutron:security_group_ids': '58c01b8f-2f37-4998-87d2-d107cf040b9e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=466363b7-72e5-437f-abf8-f548593b34f9, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.486 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe in datapath f58a1180-800c-4113-a42d-53ba7a83ff14 bound to our chassis#033[00m
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.488 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f58a1180-800c-4113-a42d-53ba7a83ff14#033[00m
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.509 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[dc86eebb-8efa-4297-8c66-cee136f882ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.538 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[e80041b8-1588-4324-abe7-13c7eb1bc8b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.542 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[41e4ec2d-ea8f-425f-a37d-d05da306cc40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.573 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[32da8100-187e-4b10-a688-9106aab3c982]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:25 np0005546954 podman[218355]: 2025-12-05 13:08:25.584219423 +0000 UTC m=+0.081149305 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.593 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[071c36e5-846f-40c9-8afe-faa84becba76]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf58a1180-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:5d:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501364, 'reachable_time': 41115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218406, 'error': None, 'target': 'ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.608 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa8d1d9-e882-4a0a-b548-879a4e8ca3f0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf58a1180-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501377, 'tstamp': 501377}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218407, 'error': None, 'target': 'ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf58a1180-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501381, 'tstamp': 501381}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218407, 'error': None, 'target': 'ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.610 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf58a1180-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:25 np0005546954 podman[218354]: 2025-12-05 13:08:25.613932573 +0000 UTC m=+0.110695169 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 08:08:25 np0005546954 nova_compute[187160]: 2025-12-05 13:08:25.642 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.646 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf58a1180-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.646 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.647 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf58a1180-80, col_values=(('external_ids', {'iface-id': 'e6453cd1-5272-499c-b5ee-eab27e5bfe29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:25.647 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:08:25 np0005546954 nova_compute[187160]: 2025-12-05 13:08:25.828 187164 INFO nova.compute.manager [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Post operation of migration started#033[00m
Dec  5 08:08:26 np0005546954 nova_compute[187160]: 2025-12-05 13:08:26.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:08:26 np0005546954 nova_compute[187160]: 2025-12-05 13:08:26.124 187164 DEBUG oslo_concurrency.lockutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-bdbf5388-d51b-4523-9277-e4b5ce90ed5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:08:26 np0005546954 nova_compute[187160]: 2025-12-05 13:08:26.124 187164 DEBUG oslo_concurrency.lockutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-bdbf5388-d51b-4523-9277-e4b5ce90ed5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:08:26 np0005546954 nova_compute[187160]: 2025-12-05 13:08:26.124 187164 DEBUG nova.network.neutron [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 08:08:26 np0005546954 nova_compute[187160]: 2025-12-05 13:08:26.304 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:27 np0005546954 nova_compute[187160]: 2025-12-05 13:08:27.476 187164 DEBUG nova.network.neutron [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Updating instance_info_cache with network_info: [{"id": "2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe", "address": "fa:16:3e:e5:5a:7e", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e70bf4b-d3", "ovs_interfaceid": "2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:08:27 np0005546954 nova_compute[187160]: 2025-12-05 13:08:27.494 187164 DEBUG oslo_concurrency.lockutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-bdbf5388-d51b-4523-9277-e4b5ce90ed5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:08:27 np0005546954 nova_compute[187160]: 2025-12-05 13:08:27.511 187164 DEBUG oslo_concurrency.lockutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:27 np0005546954 nova_compute[187160]: 2025-12-05 13:08:27.511 187164 DEBUG oslo_concurrency.lockutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:27 np0005546954 nova_compute[187160]: 2025-12-05 13:08:27.511 187164 DEBUG oslo_concurrency.lockutils [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:27 np0005546954 nova_compute[187160]: 2025-12-05 13:08:27.516 187164 INFO nova.virt.libvirt.driver [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  5 08:08:27 np0005546954 virtqemud[186730]: Domain id=24 name='instance-0000001a' uuid=bdbf5388-d51b-4523-9277-e4b5ce90ed5a is tainted: custom-monitor
Dec  5 08:08:28 np0005546954 nova_compute[187160]: 2025-12-05 13:08:28.523 187164 INFO nova.virt.libvirt.driver [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  5 08:08:29 np0005546954 nova_compute[187160]: 2025-12-05 13:08:29.325 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:29 np0005546954 nova_compute[187160]: 2025-12-05 13:08:29.529 187164 INFO nova.virt.libvirt.driver [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  5 08:08:29 np0005546954 nova_compute[187160]: 2025-12-05 13:08:29.537 187164 DEBUG nova.compute.manager [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:08:29 np0005546954 nova_compute[187160]: 2025-12-05 13:08:29.871 187164 DEBUG nova.objects.instance [None req-405e8f55-da63-447d-8aa1-7969eb57bebe 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 08:08:31 np0005546954 nova_compute[187160]: 2025-12-05 13:08:31.306 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:34 np0005546954 nova_compute[187160]: 2025-12-05 13:08:34.327 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:35 np0005546954 podman[197513]: time="2025-12-05T13:08:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:08:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:08:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:08:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:08:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3058 "" "Go-http-client/1.1"
Dec  5 08:08:36 np0005546954 nova_compute[187160]: 2025-12-05 13:08:36.309 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:36 np0005546954 nova_compute[187160]: 2025-12-05 13:08:36.947 187164 DEBUG oslo_concurrency.lockutils [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "bdbf5388-d51b-4523-9277-e4b5ce90ed5a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:36 np0005546954 nova_compute[187160]: 2025-12-05 13:08:36.948 187164 DEBUG oslo_concurrency.lockutils [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "bdbf5388-d51b-4523-9277-e4b5ce90ed5a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:36 np0005546954 nova_compute[187160]: 2025-12-05 13:08:36.949 187164 DEBUG oslo_concurrency.lockutils [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "bdbf5388-d51b-4523-9277-e4b5ce90ed5a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:36 np0005546954 nova_compute[187160]: 2025-12-05 13:08:36.949 187164 DEBUG oslo_concurrency.lockutils [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "bdbf5388-d51b-4523-9277-e4b5ce90ed5a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:36 np0005546954 nova_compute[187160]: 2025-12-05 13:08:36.950 187164 DEBUG oslo_concurrency.lockutils [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "bdbf5388-d51b-4523-9277-e4b5ce90ed5a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:36 np0005546954 nova_compute[187160]: 2025-12-05 13:08:36.952 187164 INFO nova.compute.manager [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Terminating instance#033[00m
Dec  5 08:08:36 np0005546954 nova_compute[187160]: 2025-12-05 13:08:36.954 187164 DEBUG nova.compute.manager [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 08:08:36 np0005546954 kernel: tap2e70bf4b-d3 (unregistering): left promiscuous mode
Dec  5 08:08:36 np0005546954 NetworkManager[55665]: <info>  [1764940116.9748] device (tap2e70bf4b-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 08:08:36 np0005546954 ovn_controller[95566]: 2025-12-05T13:08:36Z|00248|binding|INFO|Releasing lport 2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe from this chassis (sb_readonly=0)
Dec  5 08:08:36 np0005546954 ovn_controller[95566]: 2025-12-05T13:08:36Z|00249|binding|INFO|Setting lport 2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe down in Southbound
Dec  5 08:08:36 np0005546954 nova_compute[187160]: 2025-12-05 13:08:36.980 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:36 np0005546954 ovn_controller[95566]: 2025-12-05T13:08:36Z|00250|binding|INFO|Removing iface tap2e70bf4b-d3 ovn-installed in OVS
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.002 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:37 np0005546954 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Dec  5 08:08:37 np0005546954 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001a.scope: Consumed 1.586s CPU time.
Dec  5 08:08:37 np0005546954 systemd-machined[153497]: Machine qemu-24-instance-0000001a terminated.
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.058 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:5a:7e 10.100.0.14'], port_security=['fa:16:3e:e5:5a:7e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'bdbf5388-d51b-4523-9277-e4b5ce90ed5a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f58a1180-800c-4113-a42d-53ba7a83ff14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd8844cdffbc42fda56f46bb649ff60d', 'neutron:revision_number': '13', 'neutron:security_group_ids': '58c01b8f-2f37-4998-87d2-d107cf040b9e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=466363b7-72e5-437f-abf8-f548593b34f9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.059 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe in datapath f58a1180-800c-4113-a42d-53ba7a83ff14 unbound from our chassis#033[00m
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.060 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f58a1180-800c-4113-a42d-53ba7a83ff14#033[00m
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.077 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[744ec7a0-ccc1-43d6-90fe-e4bc5db8d785]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:37 np0005546954 podman[218412]: 2025-12-05 13:08:37.085387709 +0000 UTC m=+0.075197159 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  5 08:08:37 np0005546954 podman[218410]: 2025-12-05 13:08:37.085361829 +0000 UTC m=+0.075270052 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.115 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[35fb727a-787a-4404-ba73-2000b708a515]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.117 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[9c4f4ebb-013b-4170-9a9f-64be22d58137]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.147 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[33647160-0550-47cb-9512-9077d84adbb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.165 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e946d14d-6d90-4be9-8fc6-35a63843d4b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf58a1180-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:5d:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501364, 'reachable_time': 41115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218462, 'error': None, 'target': 'ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.176 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.180 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.182 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[dfa61abb-cdee-4a85-ae18-e3030d425fde]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf58a1180-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501377, 'tstamp': 501377}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218465, 'error': None, 'target': 'ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf58a1180-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501381, 'tstamp': 501381}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218465, 'error': None, 'target': 'ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.184 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf58a1180-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.185 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.189 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf58a1180-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.188 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.189 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.189 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf58a1180-80, col_values=(('external_ids', {'iface-id': 'e6453cd1-5272-499c-b5ee-eab27e5bfe29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:37 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:37.190 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.238 187164 INFO nova.virt.libvirt.driver [-] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Instance destroyed successfully.#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.239 187164 DEBUG nova.objects.instance [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lazy-loading 'resources' on Instance uuid bdbf5388-d51b-4523-9277-e4b5ce90ed5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.290 187164 DEBUG nova.virt.libvirt.vif [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T13:06:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1515208707',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1515208707',id=26,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:07:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cd8844cdffbc42fda56f46bb649ff60d',ramdisk_id='',reservation_id='r-sv42ywpi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1594092090',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1594092090-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T13:08:30Z,user_data=None,user_id='2ac3cc621ecc4823aed54d0815090a78',uuid=bdbf5388-d51b-4523-9277-e4b5ce90ed5a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe", "address": "fa:16:3e:e5:5a:7e", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e70bf4b-d3", "ovs_interfaceid": "2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.291 187164 DEBUG nova.network.os_vif_util [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Converting VIF {"id": "2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe", "address": "fa:16:3e:e5:5a:7e", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e70bf4b-d3", "ovs_interfaceid": "2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.292 187164 DEBUG nova.network.os_vif_util [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e5:5a:7e,bridge_name='br-int',has_traffic_filtering=True,id=2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e70bf4b-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.293 187164 DEBUG os_vif [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:5a:7e,bridge_name='br-int',has_traffic_filtering=True,id=2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e70bf4b-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.296 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.297 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e70bf4b-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.300 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.304 187164 INFO os_vif [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e5:5a:7e,bridge_name='br-int',has_traffic_filtering=True,id=2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e70bf4b-d3')#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.305 187164 INFO nova.virt.libvirt.driver [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Deleting instance files /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a_del#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.307 187164 INFO nova.virt.libvirt.driver [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Deletion of /var/lib/nova/instances/bdbf5388-d51b-4523-9277-e4b5ce90ed5a_del complete#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.665 187164 DEBUG nova.compute.manager [req-5ed47039-8d9c-431f-843d-ebc60c0ed6f4 req-f8c62a99-1485-4478-a478-55d79d44209a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Received event network-vif-unplugged-2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.666 187164 DEBUG oslo_concurrency.lockutils [req-5ed47039-8d9c-431f-843d-ebc60c0ed6f4 req-f8c62a99-1485-4478-a478-55d79d44209a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "bdbf5388-d51b-4523-9277-e4b5ce90ed5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.666 187164 DEBUG oslo_concurrency.lockutils [req-5ed47039-8d9c-431f-843d-ebc60c0ed6f4 req-f8c62a99-1485-4478-a478-55d79d44209a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bdbf5388-d51b-4523-9277-e4b5ce90ed5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.666 187164 DEBUG oslo_concurrency.lockutils [req-5ed47039-8d9c-431f-843d-ebc60c0ed6f4 req-f8c62a99-1485-4478-a478-55d79d44209a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bdbf5388-d51b-4523-9277-e4b5ce90ed5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.667 187164 DEBUG nova.compute.manager [req-5ed47039-8d9c-431f-843d-ebc60c0ed6f4 req-f8c62a99-1485-4478-a478-55d79d44209a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] No waiting events found dispatching network-vif-unplugged-2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.667 187164 DEBUG nova.compute.manager [req-5ed47039-8d9c-431f-843d-ebc60c0ed6f4 req-f8c62a99-1485-4478-a478-55d79d44209a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Received event network-vif-unplugged-2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.821 187164 INFO nova.compute.manager [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Took 0.87 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.822 187164 DEBUG oslo.service.loopingcall [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.823 187164 DEBUG nova.compute.manager [-] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 08:08:37 np0005546954 nova_compute[187160]: 2025-12-05 13:08:37.823 187164 DEBUG nova.network.neutron [-] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 08:08:38 np0005546954 nova_compute[187160]: 2025-12-05 13:08:38.543 187164 DEBUG nova.network.neutron [-] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:08:38 np0005546954 nova_compute[187160]: 2025-12-05 13:08:38.561 187164 INFO nova.compute.manager [-] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Took 0.74 seconds to deallocate network for instance.#033[00m
Dec  5 08:08:38 np0005546954 nova_compute[187160]: 2025-12-05 13:08:38.698 187164 DEBUG oslo_concurrency.lockutils [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:38 np0005546954 nova_compute[187160]: 2025-12-05 13:08:38.699 187164 DEBUG oslo_concurrency.lockutils [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:38 np0005546954 nova_compute[187160]: 2025-12-05 13:08:38.706 187164 DEBUG oslo_concurrency.lockutils [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:38 np0005546954 nova_compute[187160]: 2025-12-05 13:08:38.740 187164 INFO nova.scheduler.client.report [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Deleted allocations for instance bdbf5388-d51b-4523-9277-e4b5ce90ed5a#033[00m
Dec  5 08:08:38 np0005546954 nova_compute[187160]: 2025-12-05 13:08:38.971 187164 DEBUG oslo_concurrency.lockutils [None req-e93e8130-ecfe-4dda-bc6b-bb91848d4f72 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "bdbf5388-d51b-4523-9277-e4b5ce90ed5a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.703 187164 DEBUG oslo_concurrency.lockutils [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.704 187164 DEBUG oslo_concurrency.lockutils [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.704 187164 DEBUG oslo_concurrency.lockutils [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.704 187164 DEBUG oslo_concurrency.lockutils [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.705 187164 DEBUG oslo_concurrency.lockutils [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.706 187164 INFO nova.compute.manager [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Terminating instance#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.707 187164 DEBUG nova.compute.manager [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 08:08:39 np0005546954 kernel: tap3e75686e-2b (unregistering): left promiscuous mode
Dec  5 08:08:39 np0005546954 NetworkManager[55665]: <info>  [1764940119.7440] device (tap3e75686e-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.745 187164 DEBUG nova.compute.manager [req-ca4a5999-deba-4ea7-9eb7-f620656c972e req-1e660753-54f2-490e-8a98-8a4a7ff1a73a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Received event network-vif-plugged-2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.745 187164 DEBUG oslo_concurrency.lockutils [req-ca4a5999-deba-4ea7-9eb7-f620656c972e req-1e660753-54f2-490e-8a98-8a4a7ff1a73a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "bdbf5388-d51b-4523-9277-e4b5ce90ed5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.745 187164 DEBUG oslo_concurrency.lockutils [req-ca4a5999-deba-4ea7-9eb7-f620656c972e req-1e660753-54f2-490e-8a98-8a4a7ff1a73a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bdbf5388-d51b-4523-9277-e4b5ce90ed5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.746 187164 DEBUG oslo_concurrency.lockutils [req-ca4a5999-deba-4ea7-9eb7-f620656c972e req-1e660753-54f2-490e-8a98-8a4a7ff1a73a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "bdbf5388-d51b-4523-9277-e4b5ce90ed5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.746 187164 DEBUG nova.compute.manager [req-ca4a5999-deba-4ea7-9eb7-f620656c972e req-1e660753-54f2-490e-8a98-8a4a7ff1a73a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] No waiting events found dispatching network-vif-plugged-2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.746 187164 WARNING nova.compute.manager [req-ca4a5999-deba-4ea7-9eb7-f620656c972e req-1e660753-54f2-490e-8a98-8a4a7ff1a73a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Received unexpected event network-vif-plugged-2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe for instance with vm_state deleted and task_state None.#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.746 187164 DEBUG nova.compute.manager [req-ca4a5999-deba-4ea7-9eb7-f620656c972e req-1e660753-54f2-490e-8a98-8a4a7ff1a73a 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Received event network-vif-deleted-2e70bf4b-d3f4-4499-a0bf-5eee59ffcbfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:08:39 np0005546954 ovn_controller[95566]: 2025-12-05T13:08:39Z|00251|binding|INFO|Releasing lport 3e75686e-2bb1-41fe-b540-d4013058afc1 from this chassis (sb_readonly=0)
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.751 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:39 np0005546954 ovn_controller[95566]: 2025-12-05T13:08:39Z|00252|binding|INFO|Setting lport 3e75686e-2bb1-41fe-b540-d4013058afc1 down in Southbound
Dec  5 08:08:39 np0005546954 ovn_controller[95566]: 2025-12-05T13:08:39Z|00253|binding|INFO|Removing iface tap3e75686e-2b ovn-installed in OVS
Dec  5 08:08:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:39.758 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:c2:e7 10.100.0.7'], port_security=['fa:16:3e:c3:c2:e7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ff3a12a0-9d4f-4501-99cd-dc404a5e80b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f58a1180-800c-4113-a42d-53ba7a83ff14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd8844cdffbc42fda56f46bb649ff60d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '58c01b8f-2f37-4998-87d2-d107cf040b9e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=466363b7-72e5-437f-abf8-f548593b34f9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=3e75686e-2bb1-41fe-b540-d4013058afc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:08:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:39.760 104428 INFO neutron.agent.ovn.metadata.agent [-] Port 3e75686e-2bb1-41fe-b540-d4013058afc1 in datapath f58a1180-800c-4113-a42d-53ba7a83ff14 unbound from our chassis#033[00m
Dec  5 08:08:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:39.762 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f58a1180-800c-4113-a42d-53ba7a83ff14, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 08:08:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:39.763 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[54ac3f4e-493b-4983-a3ea-90bb5baa86fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:39.763 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14 namespace which is not needed anymore#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.767 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:39 np0005546954 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000019.scope: Deactivated successfully.
Dec  5 08:08:39 np0005546954 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000019.scope: Consumed 17.352s CPU time.
Dec  5 08:08:39 np0005546954 systemd-machined[153497]: Machine qemu-23-instance-00000019 terminated.
Dec  5 08:08:39 np0005546954 neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14[217877]: [NOTICE]   (217888) : haproxy version is 2.8.14-c23fe91
Dec  5 08:08:39 np0005546954 neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14[217877]: [NOTICE]   (217888) : path to executable is /usr/sbin/haproxy
Dec  5 08:08:39 np0005546954 neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14[217877]: [WARNING]  (217888) : Exiting Master process...
Dec  5 08:08:39 np0005546954 neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14[217877]: [WARNING]  (217888) : Exiting Master process...
Dec  5 08:08:39 np0005546954 neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14[217877]: [ALERT]    (217888) : Current worker (217890) exited with code 143 (Terminated)
Dec  5 08:08:39 np0005546954 neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14[217877]: [WARNING]  (217888) : All workers exited. Exiting... (0)
Dec  5 08:08:39 np0005546954 systemd[1]: libpod-948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97.scope: Deactivated successfully.
Dec  5 08:08:39 np0005546954 podman[218505]: 2025-12-05 13:08:39.908227165 +0000 UTC m=+0.044049656 container died 948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 08:08:39 np0005546954 NetworkManager[55665]: <info>  [1764940119.9271] manager: (tap3e75686e-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Dec  5 08:08:39 np0005546954 systemd-udevd[218432]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:08:39 np0005546954 kernel: tap3e75686e-2b: entered promiscuous mode
Dec  5 08:08:39 np0005546954 kernel: tap3e75686e-2b (unregistering): left promiscuous mode
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.942 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:39 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97-userdata-shm.mount: Deactivated successfully.
Dec  5 08:08:39 np0005546954 systemd[1]: var-lib-containers-storage-overlay-b959b82288767c4c2a56741ee32fc66dc9d132e431e590c5fe8e4a6a0a1f3bad-merged.mount: Deactivated successfully.
Dec  5 08:08:39 np0005546954 podman[218505]: 2025-12-05 13:08:39.958232133 +0000 UTC m=+0.094054634 container cleanup 948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:08:39 np0005546954 systemd[1]: libpod-conmon-948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97.scope: Deactivated successfully.
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.982 187164 INFO nova.virt.libvirt.driver [-] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Instance destroyed successfully.#033[00m
Dec  5 08:08:39 np0005546954 nova_compute[187160]: 2025-12-05 13:08:39.982 187164 DEBUG nova.objects.instance [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lazy-loading 'resources' on Instance uuid ff3a12a0-9d4f-4501-99cd-dc404a5e80b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:08:40 np0005546954 podman[218542]: 2025-12-05 13:08:40.03045217 +0000 UTC m=+0.049077540 container remove 948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  5 08:08:40 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:40.036 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[81b98b61-77ae-4e01-b58b-ac6d1fc6cfaf]: (4, ('Fri Dec  5 01:08:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14 (948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97)\n948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97\nFri Dec  5 01:08:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14 (948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97)\n948cffc971e1ebb6aada18d6673574df67ba399fb55bd11a2feb66e08ff74d97\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:40 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:40.038 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c8bf0965-9887-4033-81a9-0a9397c5e000]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:40 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:40.039 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf58a1180-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.041 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:40 np0005546954 kernel: tapf58a1180-80: left promiscuous mode
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.058 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:40 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:40.061 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf27469-7c04-4fd7-8e74-24b2e1becb83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:40 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:40.083 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c3db3b8c-f07c-4e26-8ce1-f8f8ed01b925]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:40 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:40.085 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[fc7d5187-2f68-457f-93a4-a85553fb4335]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:40 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:40.101 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[76fab18e-acf0-41c9-9572-b3b5b3e0a07e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501357, 'reachable_time': 17547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218561, 'error': None, 'target': 'ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:40 np0005546954 systemd[1]: run-netns-ovnmeta\x2df58a1180\x2d800c\x2d4113\x2da42d\x2d53ba7a83ff14.mount: Deactivated successfully.
Dec  5 08:08:40 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:40.104 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f58a1180-800c-4113-a42d-53ba7a83ff14 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 08:08:40 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:08:40.105 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[341dab95-dd5b-4efa-a76a-6d226f7794e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.387 187164 DEBUG nova.virt.libvirt.vif [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T13:06:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1286551380',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1286551380',id=25,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:06:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cd8844cdffbc42fda56f46bb649ff60d',ramdisk_id='',reservation_id='r-jxt8u7pv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',im
age_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1594092090',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1594092090-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T13:06:48Z,user_data=None,user_id='2ac3cc621ecc4823aed54d0815090a78',uuid=ff3a12a0-9d4f-4501-99cd-dc404a5e80b7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e75686e-2bb1-41fe-b540-d4013058afc1", "address": "fa:16:3e:c3:c2:e7", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e75686e-2b", "ovs_interfaceid": "3e75686e-2bb1-41fe-b540-d4013058afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.387 187164 DEBUG nova.network.os_vif_util [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Converting VIF {"id": "3e75686e-2bb1-41fe-b540-d4013058afc1", "address": "fa:16:3e:c3:c2:e7", "network": {"id": "f58a1180-800c-4113-a42d-53ba7a83ff14", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-2054968628-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cd8844cdffbc42fda56f46bb649ff60d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e75686e-2b", "ovs_interfaceid": "3e75686e-2bb1-41fe-b540-d4013058afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.388 187164 DEBUG nova.network.os_vif_util [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:c2:e7,bridge_name='br-int',has_traffic_filtering=True,id=3e75686e-2bb1-41fe-b540-d4013058afc1,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e75686e-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.388 187164 DEBUG os_vif [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:c2:e7,bridge_name='br-int',has_traffic_filtering=True,id=3e75686e-2bb1-41fe-b540-d4013058afc1,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e75686e-2b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.390 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.390 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e75686e-2b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.425 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.427 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.430 187164 INFO os_vif [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:c2:e7,bridge_name='br-int',has_traffic_filtering=True,id=3e75686e-2bb1-41fe-b540-d4013058afc1,network=Network(f58a1180-800c-4113-a42d-53ba7a83ff14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e75686e-2b')#033[00m
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.430 187164 INFO nova.virt.libvirt.driver [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Deleting instance files /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7_del#033[00m
Dec  5 08:08:40 np0005546954 nova_compute[187160]: 2025-12-05 13:08:40.431 187164 INFO nova.virt.libvirt.driver [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Deletion of /var/lib/nova/instances/ff3a12a0-9d4f-4501-99cd-dc404a5e80b7_del complete#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.019 187164 INFO nova.compute.manager [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Took 1.31 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.020 187164 DEBUG oslo.service.loopingcall [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.021 187164 DEBUG nova.compute.manager [-] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.021 187164 DEBUG nova.network.neutron [-] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.313 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.561 187164 DEBUG nova.network.neutron [-] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.581 187164 INFO nova.compute.manager [-] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Took 0.56 seconds to deallocate network for instance.#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.624 187164 DEBUG oslo_concurrency.lockutils [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.625 187164 DEBUG oslo_concurrency.lockutils [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.672 187164 DEBUG nova.compute.provider_tree [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.689 187164 DEBUG nova.scheduler.client.report [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.713 187164 DEBUG oslo_concurrency.lockutils [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.736 187164 INFO nova.scheduler.client.report [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Deleted allocations for instance ff3a12a0-9d4f-4501-99cd-dc404a5e80b7#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.805 187164 DEBUG oslo_concurrency.lockutils [None req-ff1b1486-9da1-498a-9990-82d4a2753c79 2ac3cc621ecc4823aed54d0815090a78 cd8844cdffbc42fda56f46bb649ff60d - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.826 187164 DEBUG nova.compute.manager [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Received event network-vif-unplugged-3e75686e-2bb1-41fe-b540-d4013058afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.827 187164 DEBUG oslo_concurrency.lockutils [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.827 187164 DEBUG oslo_concurrency.lockutils [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.828 187164 DEBUG oslo_concurrency.lockutils [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.828 187164 DEBUG nova.compute.manager [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] No waiting events found dispatching network-vif-unplugged-3e75686e-2bb1-41fe-b540-d4013058afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.829 187164 WARNING nova.compute.manager [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Received unexpected event network-vif-unplugged-3e75686e-2bb1-41fe-b540-d4013058afc1 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.829 187164 DEBUG nova.compute.manager [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Received event network-vif-plugged-3e75686e-2bb1-41fe-b540-d4013058afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.830 187164 DEBUG oslo_concurrency.lockutils [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.830 187164 DEBUG oslo_concurrency.lockutils [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.831 187164 DEBUG oslo_concurrency.lockutils [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "ff3a12a0-9d4f-4501-99cd-dc404a5e80b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.831 187164 DEBUG nova.compute.manager [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] No waiting events found dispatching network-vif-plugged-3e75686e-2bb1-41fe-b540-d4013058afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.832 187164 WARNING nova.compute.manager [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Received unexpected event network-vif-plugged-3e75686e-2bb1-41fe-b540-d4013058afc1 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 08:08:41 np0005546954 nova_compute[187160]: 2025-12-05 13:08:41.833 187164 DEBUG nova.compute.manager [req-8a004fca-8f45-4b41-8f4b-f3a4b57f7458 req-e201c06e-3f80-48e5-8556-8b3f0a2a910b 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Received event network-vif-deleted-3e75686e-2bb1-41fe-b540-d4013058afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:08:45 np0005546954 nova_compute[187160]: 2025-12-05 13:08:45.425 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:46 np0005546954 nova_compute[187160]: 2025-12-05 13:08:46.313 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:08:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:08:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:08:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:08:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:08:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:08:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:08:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:08:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:08:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:08:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:08:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:08:50 np0005546954 nova_compute[187160]: 2025-12-05 13:08:50.427 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:50 np0005546954 podman[218568]: 2025-12-05 13:08:50.580435359 +0000 UTC m=+0.080798564 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 08:08:51 np0005546954 nova_compute[187160]: 2025-12-05 13:08:51.317 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:52 np0005546954 nova_compute[187160]: 2025-12-05 13:08:52.236 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764940117.2340236, bdbf5388-d51b-4523-9277-e4b5ce90ed5a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:08:52 np0005546954 nova_compute[187160]: 2025-12-05 13:08:52.236 187164 INFO nova.compute.manager [-] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] VM Stopped (Lifecycle Event)#033[00m
Dec  5 08:08:52 np0005546954 nova_compute[187160]: 2025-12-05 13:08:52.266 187164 DEBUG nova.compute.manager [None req-c8a4f9e6-c863-4710-bc54-e283c2c6124f - - - - - -] [instance: bdbf5388-d51b-4523-9277-e4b5ce90ed5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:08:54 np0005546954 nova_compute[187160]: 2025-12-05 13:08:54.982 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764940119.9801917, ff3a12a0-9d4f-4501-99cd-dc404a5e80b7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:08:54 np0005546954 nova_compute[187160]: 2025-12-05 13:08:54.983 187164 INFO nova.compute.manager [-] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] VM Stopped (Lifecycle Event)#033[00m
Dec  5 08:08:55 np0005546954 nova_compute[187160]: 2025-12-05 13:08:55.308 187164 DEBUG nova.compute.manager [None req-efa495d6-aac6-42f6-bd96-7ba7481b4b0a - - - - - -] [instance: ff3a12a0-9d4f-4501-99cd-dc404a5e80b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:08:55 np0005546954 nova_compute[187160]: 2025-12-05 13:08:55.429 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:56 np0005546954 nova_compute[187160]: 2025-12-05 13:08:56.319 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:08:56 np0005546954 podman[218590]: 2025-12-05 13:08:56.55805949 +0000 UTC m=+0.056170221 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 08:08:56 np0005546954 podman[218589]: 2025-12-05 13:08:56.588906285 +0000 UTC m=+0.100295197 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 08:09:00 np0005546954 nova_compute[187160]: 2025-12-05 13:09:00.431 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:01 np0005546954 nova_compute[187160]: 2025-12-05 13:09:01.321 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:05 np0005546954 nova_compute[187160]: 2025-12-05 13:09:05.432 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:05 np0005546954 podman[197513]: time="2025-12-05T13:09:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:09:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:09:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:09:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:09:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec  5 08:09:06 np0005546954 nova_compute[187160]: 2025-12-05 13:09:06.325 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:07 np0005546954 podman[218641]: 2025-12-05 13:09:07.561531034 +0000 UTC m=+0.073682403 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, architecture=x86_64, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=edpm, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git)
Dec  5 08:09:07 np0005546954 podman[218642]: 2025-12-05 13:09:07.586045323 +0000 UTC m=+0.086823849 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec  5 08:09:07 np0005546954 nova_compute[187160]: 2025-12-05 13:09:07.959 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:10 np0005546954 nova_compute[187160]: 2025-12-05 13:09:10.434 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:11 np0005546954 nova_compute[187160]: 2025-12-05 13:09:11.326 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:15 np0005546954 nova_compute[187160]: 2025-12-05 13:09:15.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:09:15 np0005546954 nova_compute[187160]: 2025-12-05 13:09:15.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:09:15 np0005546954 nova_compute[187160]: 2025-12-05 13:09:15.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:09:15 np0005546954 nova_compute[187160]: 2025-12-05 13:09:15.123 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:09:15 np0005546954 nova_compute[187160]: 2025-12-05 13:09:15.436 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:16 np0005546954 nova_compute[187160]: 2025-12-05 13:09:16.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:09:16 np0005546954 nova_compute[187160]: 2025-12-05 13:09:16.327 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:09:16.973 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:09:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:09:16.974 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:09:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:09:16.974 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:09:18 np0005546954 nova_compute[187160]: 2025-12-05 13:09:18.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:09:18 np0005546954 nova_compute[187160]: 2025-12-05 13:09:18.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:09:19 np0005546954 nova_compute[187160]: 2025-12-05 13:09:19.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:09:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:09:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:09:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:09:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:09:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:09:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:09:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:09:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:09:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:09:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:09:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:09:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.062 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.063 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.063 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.064 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.266 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.267 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5867MB free_disk=73.32955551147461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.268 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.268 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.355 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.356 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.370 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing inventories for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.384 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating ProviderTree inventory for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.385 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.400 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing aggregate associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.437 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing trait associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.440 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.461 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.476 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.493 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:09:20 np0005546954 nova_compute[187160]: 2025-12-05 13:09:20.493 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:09:21 np0005546954 nova_compute[187160]: 2025-12-05 13:09:21.369 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:21 np0005546954 nova_compute[187160]: 2025-12-05 13:09:21.489 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:09:21 np0005546954 podman[218683]: 2025-12-05 13:09:21.546940202 +0000 UTC m=+0.054182629 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Dec  5 08:09:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:09:23.440 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:09:23 np0005546954 nova_compute[187160]: 2025-12-05 13:09:23.441 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:09:23.442 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 08:09:25 np0005546954 nova_compute[187160]: 2025-12-05 13:09:25.441 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:26 np0005546954 nova_compute[187160]: 2025-12-05 13:09:26.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:09:26 np0005546954 nova_compute[187160]: 2025-12-05 13:09:26.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:09:26 np0005546954 nova_compute[187160]: 2025-12-05 13:09:26.370 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:27 np0005546954 nova_compute[187160]: 2025-12-05 13:09:27.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:09:27 np0005546954 podman[218703]: 2025-12-05 13:09:27.621011879 +0000 UTC m=+0.108768969 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 08:09:27 np0005546954 podman[218702]: 2025-12-05 13:09:27.647028834 +0000 UTC m=+0.155424354 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 08:09:28 np0005546954 nova_compute[187160]: 2025-12-05 13:09:28.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:09:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:09:28.445 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:09:30 np0005546954 nova_compute[187160]: 2025-12-05 13:09:30.443 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:31 np0005546954 nova_compute[187160]: 2025-12-05 13:09:31.373 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:35 np0005546954 nova_compute[187160]: 2025-12-05 13:09:35.447 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:35 np0005546954 podman[197513]: time="2025-12-05T13:09:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:09:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:09:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:09:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:09:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Dec  5 08:09:36 np0005546954 nova_compute[187160]: 2025-12-05 13:09:36.375 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:38 np0005546954 podman[218754]: 2025-12-05 13:09:38.560693188 +0000 UTC m=+0.071935198 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 08:09:38 np0005546954 podman[218753]: 2025-12-05 13:09:38.566196999 +0000 UTC m=+0.079782242 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, architecture=x86_64, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Dec  5 08:09:40 np0005546954 nova_compute[187160]: 2025-12-05 13:09:40.449 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:41 np0005546954 nova_compute[187160]: 2025-12-05 13:09:41.377 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:45 np0005546954 nova_compute[187160]: 2025-12-05 13:09:45.452 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:46 np0005546954 nova_compute[187160]: 2025-12-05 13:09:46.379 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:09:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:09:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:09:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:09:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:09:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:09:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:09:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:09:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:09:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:09:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:09:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:09:50 np0005546954 ovn_controller[95566]: 2025-12-05T13:09:50Z|00254|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec  5 08:09:50 np0005546954 nova_compute[187160]: 2025-12-05 13:09:50.453 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:51 np0005546954 nova_compute[187160]: 2025-12-05 13:09:51.383 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:52 np0005546954 podman[218798]: 2025-12-05 13:09:52.594001538 +0000 UTC m=+0.088094159 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 08:09:55 np0005546954 nova_compute[187160]: 2025-12-05 13:09:55.455 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:56 np0005546954 nova_compute[187160]: 2025-12-05 13:09:56.384 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.323 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.323 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.336 187164 DEBUG nova.compute.manager [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.404 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.404 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.411 187164 DEBUG nova.virt.hardware [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.411 187164 INFO nova.compute.claims [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.501 187164 DEBUG nova.compute.provider_tree [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.515 187164 DEBUG nova.scheduler.client.report [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.534 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.535 187164 DEBUG nova.compute.manager [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.574 187164 DEBUG nova.compute.manager [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.574 187164 DEBUG nova.network.neutron [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.594 187164 INFO nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.614 187164 DEBUG nova.compute.manager [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.705 187164 DEBUG nova.compute.manager [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.708 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.708 187164 INFO nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Creating image(s)#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.709 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Acquiring lock "/var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.709 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lock "/var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.710 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lock "/var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.724 187164 DEBUG oslo_concurrency.processutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.794 187164 DEBUG oslo_concurrency.processutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.796 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.796 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.813 187164 DEBUG oslo_concurrency.processutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.877 187164 DEBUG oslo_concurrency.processutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.878 187164 DEBUG oslo_concurrency.processutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.919 187164 DEBUG oslo_concurrency.processutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.920 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.921 187164 DEBUG oslo_concurrency.processutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.985 187164 DEBUG oslo_concurrency.processutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.986 187164 DEBUG nova.virt.disk.api [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Checking if we can resize image /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 08:09:57 np0005546954 nova_compute[187160]: 2025-12-05 13:09:57.987 187164 DEBUG oslo_concurrency.processutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:09:58 np0005546954 nova_compute[187160]: 2025-12-05 13:09:58.018 187164 DEBUG nova.policy [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4dd51daacb354f619a7413f8b910294c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '876dc65bd147462083d316c47ce85516', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 08:09:58 np0005546954 nova_compute[187160]: 2025-12-05 13:09:58.040 187164 DEBUG oslo_concurrency.processutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:09:58 np0005546954 nova_compute[187160]: 2025-12-05 13:09:58.042 187164 DEBUG nova.virt.disk.api [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Cannot resize image /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 08:09:58 np0005546954 nova_compute[187160]: 2025-12-05 13:09:58.042 187164 DEBUG nova.objects.instance [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lazy-loading 'migration_context' on Instance uuid 769e4da5-19b5-4fc2-a50f-79ae93eaac71 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:09:58 np0005546954 nova_compute[187160]: 2025-12-05 13:09:58.058 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 08:09:58 np0005546954 nova_compute[187160]: 2025-12-05 13:09:58.059 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Ensure instance console log exists: /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 08:09:58 np0005546954 nova_compute[187160]: 2025-12-05 13:09:58.060 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:09:58 np0005546954 nova_compute[187160]: 2025-12-05 13:09:58.060 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:09:58 np0005546954 nova_compute[187160]: 2025-12-05 13:09:58.061 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:09:58 np0005546954 nova_compute[187160]: 2025-12-05 13:09:58.488 187164 DEBUG nova.network.neutron [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Successfully created port: bf6c2a48-999b-4820-9631-dd3dfff5caac _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 08:09:58 np0005546954 podman[218833]: 2025-12-05 13:09:58.55129878 +0000 UTC m=+0.058806472 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 08:09:58 np0005546954 podman[218832]: 2025-12-05 13:09:58.57067298 +0000 UTC m=+0.084707465 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 08:09:59 np0005546954 nova_compute[187160]: 2025-12-05 13:09:59.411 187164 DEBUG nova.network.neutron [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Successfully updated port: bf6c2a48-999b-4820-9631-dd3dfff5caac _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 08:09:59 np0005546954 nova_compute[187160]: 2025-12-05 13:09:59.429 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Acquiring lock "refresh_cache-769e4da5-19b5-4fc2-a50f-79ae93eaac71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:09:59 np0005546954 nova_compute[187160]: 2025-12-05 13:09:59.429 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Acquired lock "refresh_cache-769e4da5-19b5-4fc2-a50f-79ae93eaac71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:09:59 np0005546954 nova_compute[187160]: 2025-12-05 13:09:59.430 187164 DEBUG nova.network.neutron [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 08:09:59 np0005546954 nova_compute[187160]: 2025-12-05 13:09:59.532 187164 DEBUG nova.compute.manager [req-21ea5feb-d870-4ffe-b6b4-c4876e081174 req-755ac24a-a7c3-4cb1-96cf-aaffe55316bf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-changed-bf6c2a48-999b-4820-9631-dd3dfff5caac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:09:59 np0005546954 nova_compute[187160]: 2025-12-05 13:09:59.532 187164 DEBUG nova.compute.manager [req-21ea5feb-d870-4ffe-b6b4-c4876e081174 req-755ac24a-a7c3-4cb1-96cf-aaffe55316bf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Refreshing instance network info cache due to event network-changed-bf6c2a48-999b-4820-9631-dd3dfff5caac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 08:09:59 np0005546954 nova_compute[187160]: 2025-12-05 13:09:59.532 187164 DEBUG oslo_concurrency.lockutils [req-21ea5feb-d870-4ffe-b6b4-c4876e081174 req-755ac24a-a7c3-4cb1-96cf-aaffe55316bf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-769e4da5-19b5-4fc2-a50f-79ae93eaac71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:09:59 np0005546954 nova_compute[187160]: 2025-12-05 13:09:59.581 187164 DEBUG nova.network.neutron [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 08:10:00 np0005546954 nova_compute[187160]: 2025-12-05 13:10:00.457 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.043 187164 DEBUG nova.network.neutron [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Updating instance_info_cache with network_info: [{"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.064 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Releasing lock "refresh_cache-769e4da5-19b5-4fc2-a50f-79ae93eaac71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.064 187164 DEBUG nova.compute.manager [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Instance network_info: |[{"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.064 187164 DEBUG oslo_concurrency.lockutils [req-21ea5feb-d870-4ffe-b6b4-c4876e081174 req-755ac24a-a7c3-4cb1-96cf-aaffe55316bf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-769e4da5-19b5-4fc2-a50f-79ae93eaac71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.065 187164 DEBUG nova.network.neutron [req-21ea5feb-d870-4ffe-b6b4-c4876e081174 req-755ac24a-a7c3-4cb1-96cf-aaffe55316bf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Refreshing network info cache for port bf6c2a48-999b-4820-9631-dd3dfff5caac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.068 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Start _get_guest_xml network_info=[{"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'encrypted': False, 'image_id': 'f4c3125a-6fd0-40bb-aa00-a7e736ee853d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.073 187164 WARNING nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.078 187164 DEBUG nova.virt.libvirt.host [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.079 187164 DEBUG nova.virt.libvirt.host [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.085 187164 DEBUG nova.virt.libvirt.host [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.086 187164 DEBUG nova.virt.libvirt.host [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.087 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.087 187164 DEBUG nova.virt.hardware [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:39:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b4ea63be-97f8-4a48-b000-66321c4ddb27',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T12:39:17Z,direct_url=<?>,disk_format='qcow2',id=f4c3125a-6fd0-40bb-aa00-a7e736ee853d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='83916c53de6f404f91206339303e1b23',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T12:39:21Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.088 187164 DEBUG nova.virt.hardware [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.088 187164 DEBUG nova.virt.hardware [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.089 187164 DEBUG nova.virt.hardware [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.089 187164 DEBUG nova.virt.hardware [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.089 187164 DEBUG nova.virt.hardware [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.089 187164 DEBUG nova.virt.hardware [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.090 187164 DEBUG nova.virt.hardware [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.090 187164 DEBUG nova.virt.hardware [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.090 187164 DEBUG nova.virt.hardware [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.091 187164 DEBUG nova.virt.hardware [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.096 187164 DEBUG nova.virt.libvirt.vif [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T13:09:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1045316080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1045316080',id=27,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='876dc65bd147462083d316c47ce85516',ramdisk_id='',reservation_id='r-1t1mv4oh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-989128504',owner_user_name='tempest-TestExecuteWorkloadB
alancingStrategy-989128504-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:09:57Z,user_data=None,user_id='4dd51daacb354f619a7413f8b910294c',uuid=769e4da5-19b5-4fc2-a50f-79ae93eaac71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.096 187164 DEBUG nova.network.os_vif_util [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Converting VIF {"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.097 187164 DEBUG nova.network.os_vif_util [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:68:0e,bridge_name='br-int',has_traffic_filtering=True,id=bf6c2a48-999b-4820-9631-dd3dfff5caac,network=Network(b5e62496-3107-4a7c-9c99-52197cb36a02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf6c2a48-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.098 187164 DEBUG nova.objects.instance [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lazy-loading 'pci_devices' on Instance uuid 769e4da5-19b5-4fc2-a50f-79ae93eaac71 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.111 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] End _get_guest_xml xml=<domain type="kvm">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  <uuid>769e4da5-19b5-4fc2-a50f-79ae93eaac71</uuid>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  <name>instance-0000001b</name>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  <memory>131072</memory>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  <vcpu>1</vcpu>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  <metadata>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <nova:name>tempest-TestExecuteWorkloadBalancingStrategy-server-1045316080</nova:name>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <nova:creationTime>2025-12-05 13:10:01</nova:creationTime>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <nova:flavor name="m1.nano">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:        <nova:memory>128</nova:memory>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:        <nova:disk>1</nova:disk>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:        <nova:swap>0</nova:swap>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:        <nova:vcpus>1</nova:vcpus>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      </nova:flavor>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <nova:owner>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:        <nova:user uuid="4dd51daacb354f619a7413f8b910294c">tempest-TestExecuteWorkloadBalancingStrategy-989128504-project-member</nova:user>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:        <nova:project uuid="876dc65bd147462083d316c47ce85516">tempest-TestExecuteWorkloadBalancingStrategy-989128504</nova:project>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      </nova:owner>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <nova:root type="image" uuid="f4c3125a-6fd0-40bb-aa00-a7e736ee853d"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <nova:ports>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:        <nova:port uuid="bf6c2a48-999b-4820-9631-dd3dfff5caac">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:        </nova:port>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      </nova:ports>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    </nova:instance>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  </metadata>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  <sysinfo type="smbios">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <system>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <entry name="manufacturer">RDO</entry>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <entry name="product">OpenStack Compute</entry>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <entry name="serial">769e4da5-19b5-4fc2-a50f-79ae93eaac71</entry>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <entry name="uuid">769e4da5-19b5-4fc2-a50f-79ae93eaac71</entry>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <entry name="family">Virtual Machine</entry>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    </system>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  </sysinfo>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  <os>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <boot dev="hd"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <smbios mode="sysinfo"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  </os>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  <features>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <acpi/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <apic/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <vmcoreinfo/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  </features>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  <clock offset="utc">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <timer name="hpet" present="no"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  </clock>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  <cpu mode="custom" match="exact">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <model>Nehalem</model>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  </cpu>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  <devices>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <disk type="file" device="disk">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <target dev="vda" bus="virtio"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    </disk>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <disk type="file" device="cdrom">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <source file="/var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk.config"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <target dev="sda" bus="sata"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    </disk>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <interface type="ethernet">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <mac address="fa:16:3e:e4:68:0e"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <mtu size="1442"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <target dev="tapbf6c2a48-99"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    </interface>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <serial type="pty">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <log file="/var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/console.log" append="off"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    </serial>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <video>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <model type="virtio"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    </video>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <input type="tablet" bus="usb"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <rng model="virtio">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <backend model="random">/dev/urandom</backend>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    </rng>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <controller type="usb" index="0"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    <memballoon model="virtio">
Dec  5 08:10:01 np0005546954 nova_compute[187160]:      <stats period="10"/>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:    </memballoon>
Dec  5 08:10:01 np0005546954 nova_compute[187160]:  </devices>
Dec  5 08:10:01 np0005546954 nova_compute[187160]: </domain>
Dec  5 08:10:01 np0005546954 nova_compute[187160]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.113 187164 DEBUG nova.compute.manager [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Preparing to wait for external event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.113 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.114 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.114 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.115 187164 DEBUG nova.virt.libvirt.vif [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T13:09:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1045316080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1045316080',id=27,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='876dc65bd147462083d316c47ce85516',ramdisk_id='',reservation_id='r-1t1mv4oh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-989128504',owner_user_name='tempest-TestExecut
eWorkloadBalancingStrategy-989128504-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:09:57Z,user_data=None,user_id='4dd51daacb354f619a7413f8b910294c',uuid=769e4da5-19b5-4fc2-a50f-79ae93eaac71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.115 187164 DEBUG nova.network.os_vif_util [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Converting VIF {"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.116 187164 DEBUG nova.network.os_vif_util [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:68:0e,bridge_name='br-int',has_traffic_filtering=True,id=bf6c2a48-999b-4820-9631-dd3dfff5caac,network=Network(b5e62496-3107-4a7c-9c99-52197cb36a02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf6c2a48-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.117 187164 DEBUG os_vif [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:68:0e,bridge_name='br-int',has_traffic_filtering=True,id=bf6c2a48-999b-4820-9631-dd3dfff5caac,network=Network(b5e62496-3107-4a7c-9c99-52197cb36a02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf6c2a48-99') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.117 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.118 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.118 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.122 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.122 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbf6c2a48-99, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.123 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbf6c2a48-99, col_values=(('external_ids', {'iface-id': 'bf6c2a48-999b-4820-9631-dd3dfff5caac', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:68:0e', 'vm-uuid': '769e4da5-19b5-4fc2-a50f-79ae93eaac71'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.124 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.126 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:10:01 np0005546954 NetworkManager[55665]: <info>  [1764940201.1263] manager: (tapbf6c2a48-99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.131 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.132 187164 INFO os_vif [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:68:0e,bridge_name='br-int',has_traffic_filtering=True,id=bf6c2a48-999b-4820-9631-dd3dfff5caac,network=Network(b5e62496-3107-4a7c-9c99-52197cb36a02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf6c2a48-99')#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.182 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.183 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.183 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] No VIF found with MAC fa:16:3e:e4:68:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.183 187164 INFO nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Using config drive#033[00m
Dec  5 08:10:01 np0005546954 nova_compute[187160]: 2025-12-05 13:10:01.387 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:02 np0005546954 nova_compute[187160]: 2025-12-05 13:10:02.992 187164 INFO nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Creating config drive at /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk.config#033[00m
Dec  5 08:10:02 np0005546954 nova_compute[187160]: 2025-12-05 13:10:02.997 187164 DEBUG oslo_concurrency.processutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp1pjmk4q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:10:03 np0005546954 nova_compute[187160]: 2025-12-05 13:10:03.119 187164 DEBUG oslo_concurrency.processutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp1pjmk4q" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:10:03 np0005546954 kernel: tapbf6c2a48-99: entered promiscuous mode
Dec  5 08:10:03 np0005546954 ovn_controller[95566]: 2025-12-05T13:10:03Z|00255|binding|INFO|Claiming lport bf6c2a48-999b-4820-9631-dd3dfff5caac for this chassis.
Dec  5 08:10:03 np0005546954 ovn_controller[95566]: 2025-12-05T13:10:03Z|00256|binding|INFO|bf6c2a48-999b-4820-9631-dd3dfff5caac: Claiming fa:16:3e:e4:68:0e 10.100.0.9
Dec  5 08:10:03 np0005546954 NetworkManager[55665]: <info>  [1764940203.1989] manager: (tapbf6c2a48-99): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Dec  5 08:10:03 np0005546954 nova_compute[187160]: 2025-12-05 13:10:03.198 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:03 np0005546954 nova_compute[187160]: 2025-12-05 13:10:03.202 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:03 np0005546954 nova_compute[187160]: 2025-12-05 13:10:03.206 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.217 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:68:0e 10.100.0.9'], port_security=['fa:16:3e:e4:68:0e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '769e4da5-19b5-4fc2-a50f-79ae93eaac71', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5e62496-3107-4a7c-9c99-52197cb36a02', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '876dc65bd147462083d316c47ce85516', 'neutron:revision_number': '2', 'neutron:security_group_ids': '77b331ae-8f43-48fd-990f-4e89bc9baf9e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7fb8b84-3250-4a16-88c8-3557896a7df7, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=bf6c2a48-999b-4820-9631-dd3dfff5caac) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.219 104428 INFO neutron.agent.ovn.metadata.agent [-] Port bf6c2a48-999b-4820-9631-dd3dfff5caac in datapath b5e62496-3107-4a7c-9c99-52197cb36a02 bound to our chassis#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.220 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5e62496-3107-4a7c-9c99-52197cb36a02#033[00m
Dec  5 08:10:03 np0005546954 systemd-machined[153497]: New machine qemu-25-instance-0000001b.
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.230 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ae635393-995a-4359-acb7-19401d324b58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.231 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb5e62496-31 in ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.233 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb5e62496-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.233 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[354eac30-e8ee-459e-812e-256d6c6e7473]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.234 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[0a402392-b779-4343-909d-dbcf18cd0401]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.248 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[43a793f5-0e26-4938-a8a5-a2c242cc2fdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 systemd[1]: Started Virtual Machine qemu-25-instance-0000001b.
Dec  5 08:10:03 np0005546954 systemd-udevd[218904]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.284 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[bd662834-fe12-4b9b-878b-e2a90bd3514e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 nova_compute[187160]: 2025-12-05 13:10:03.287 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:03 np0005546954 ovn_controller[95566]: 2025-12-05T13:10:03Z|00257|binding|INFO|Setting lport bf6c2a48-999b-4820-9631-dd3dfff5caac ovn-installed in OVS
Dec  5 08:10:03 np0005546954 ovn_controller[95566]: 2025-12-05T13:10:03Z|00258|binding|INFO|Setting lport bf6c2a48-999b-4820-9631-dd3dfff5caac up in Southbound
Dec  5 08:10:03 np0005546954 nova_compute[187160]: 2025-12-05 13:10:03.291 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:03 np0005546954 NetworkManager[55665]: <info>  [1764940203.2984] device (tapbf6c2a48-99): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 08:10:03 np0005546954 NetworkManager[55665]: <info>  [1764940203.2994] device (tapbf6c2a48-99): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.313 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[580ea1c5-dc68-4d39-a9cb-614a1969f019]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 systemd-udevd[218912]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:10:03 np0005546954 NetworkManager[55665]: <info>  [1764940203.3192] manager: (tapb5e62496-30): new Veth device (/org/freedesktop/NetworkManager/Devices/100)
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.320 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f24ab79d-56c6-47ae-bba7-e6f02c084e92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.348 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[a4836b65-79c2-40fe-bf49-263e47da53aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.351 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[3873ee3d-1291-45bf-b2c2-7a54dd465d4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 NetworkManager[55665]: <info>  [1764940203.3715] device (tapb5e62496-30): carrier: link connected
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.379 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[15551335-e776-497e-bf02-555853de9ac6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.398 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[518524c6-d662-4605-b397-2bf9bef33770]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5e62496-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:ba:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520999, 'reachable_time': 37955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218933, 'error': None, 'target': 'ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.413 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[2d626352-9b3b-4cfa-a237-2e64169807d2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:ba67'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 520999, 'tstamp': 520999}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218934, 'error': None, 'target': 'ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.432 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[099d4d82-2f4a-4dbc-baae-da2a1e1c5008]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5e62496-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:ba:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520999, 'reachable_time': 37955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218935, 'error': None, 'target': 'ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.464 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3861cf-7b13-4da9-b308-aae9ce7f4df5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.518 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f451f3eb-72b8-4f97-b7dd-a954bbadb00c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.520 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5e62496-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.520 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.520 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5e62496-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:10:03 np0005546954 kernel: tapb5e62496-30: entered promiscuous mode
Dec  5 08:10:03 np0005546954 NetworkManager[55665]: <info>  [1764940203.5231] manager: (tapb5e62496-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Dec  5 08:10:03 np0005546954 nova_compute[187160]: 2025-12-05 13:10:03.522 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:03 np0005546954 nova_compute[187160]: 2025-12-05 13:10:03.524 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.525 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5e62496-30, col_values=(('external_ids', {'iface-id': 'aa019109-35e6-476c-a72a-a2239e6e0fae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:10:03 np0005546954 ovn_controller[95566]: 2025-12-05T13:10:03Z|00259|binding|INFO|Releasing lport aa019109-35e6-476c-a72a-a2239e6e0fae from this chassis (sb_readonly=0)
Dec  5 08:10:03 np0005546954 nova_compute[187160]: 2025-12-05 13:10:03.526 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:03 np0005546954 nova_compute[187160]: 2025-12-05 13:10:03.537 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.538 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5e62496-3107-4a7c-9c99-52197cb36a02.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5e62496-3107-4a7c-9c99-52197cb36a02.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.539 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[116e84a1-ca5e-4572-88cd-830b9c771977]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.540 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-b5e62496-3107-4a7c-9c99-52197cb36a02
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/b5e62496-3107-4a7c-9c99-52197cb36a02.pid.haproxy
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID b5e62496-3107-4a7c-9c99-52197cb36a02
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 08:10:03 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:03.540 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02', 'env', 'PROCESS_TAG=haproxy-b5e62496-3107-4a7c-9c99-52197cb36a02', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b5e62496-3107-4a7c-9c99-52197cb36a02.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 08:10:03 np0005546954 nova_compute[187160]: 2025-12-05 13:10:03.837 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764940203.8371606, 769e4da5-19b5-4fc2-a50f-79ae93eaac71 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:10:03 np0005546954 nova_compute[187160]: 2025-12-05 13:10:03.838 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] VM Started (Lifecycle Event)#033[00m
Dec  5 08:10:03 np0005546954 podman[218975]: 2025-12-05 13:10:03.876621458 +0000 UTC m=+0.050731222 container create c1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 08:10:03 np0005546954 systemd[1]: Started libpod-conmon-c1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c.scope.
Dec  5 08:10:03 np0005546954 systemd[1]: Started libcrun container.
Dec  5 08:10:03 np0005546954 podman[218975]: 2025-12-05 13:10:03.847178307 +0000 UTC m=+0.021288101 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 08:10:03 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bf5fc01373642dacad4b299d861a61b75470067c2b7fbfe4361855db2fa0b4d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 08:10:03 np0005546954 podman[218975]: 2025-12-05 13:10:03.9583623 +0000 UTC m=+0.132472084 container init c1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec  5 08:10:03 np0005546954 podman[218975]: 2025-12-05 13:10:03.962951283 +0000 UTC m=+0.137061047 container start c1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  5 08:10:03 np0005546954 neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02[218990]: [NOTICE]   (218994) : New worker (218996) forked
Dec  5 08:10:03 np0005546954 neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02[218990]: [NOTICE]   (218994) : Loading success.
Dec  5 08:10:04 np0005546954 nova_compute[187160]: 2025-12-05 13:10:04.076 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:10:04 np0005546954 nova_compute[187160]: 2025-12-05 13:10:04.080 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764940203.837718, 769e4da5-19b5-4fc2-a50f-79ae93eaac71 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:10:04 np0005546954 nova_compute[187160]: 2025-12-05 13:10:04.080 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] VM Paused (Lifecycle Event)#033[00m
Dec  5 08:10:04 np0005546954 nova_compute[187160]: 2025-12-05 13:10:04.480 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:10:04 np0005546954 nova_compute[187160]: 2025-12-05 13:10:04.483 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:10:04 np0005546954 nova_compute[187160]: 2025-12-05 13:10:04.958 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.186 187164 DEBUG nova.compute.manager [req-f4141c4c-eba2-48da-9714-3e3b3afc2695 req-ecb10e48-8da8-4752-b4d2-0c25ed8c27d3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.187 187164 DEBUG oslo_concurrency.lockutils [req-f4141c4c-eba2-48da-9714-3e3b3afc2695 req-ecb10e48-8da8-4752-b4d2-0c25ed8c27d3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.188 187164 DEBUG oslo_concurrency.lockutils [req-f4141c4c-eba2-48da-9714-3e3b3afc2695 req-ecb10e48-8da8-4752-b4d2-0c25ed8c27d3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.188 187164 DEBUG oslo_concurrency.lockutils [req-f4141c4c-eba2-48da-9714-3e3b3afc2695 req-ecb10e48-8da8-4752-b4d2-0c25ed8c27d3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.188 187164 DEBUG nova.compute.manager [req-f4141c4c-eba2-48da-9714-3e3b3afc2695 req-ecb10e48-8da8-4752-b4d2-0c25ed8c27d3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Processing event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.189 187164 DEBUG nova.compute.manager [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.193 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764940205.1934333, 769e4da5-19b5-4fc2-a50f-79ae93eaac71 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.194 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] VM Resumed (Lifecycle Event)#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.195 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.199 187164 INFO nova.virt.libvirt.driver [-] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Instance spawned successfully.#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.200 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.225 187164 DEBUG nova.network.neutron [req-21ea5feb-d870-4ffe-b6b4-c4876e081174 req-755ac24a-a7c3-4cb1-96cf-aaffe55316bf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Updated VIF entry in instance network info cache for port bf6c2a48-999b-4820-9631-dd3dfff5caac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.226 187164 DEBUG nova.network.neutron [req-21ea5feb-d870-4ffe-b6b4-c4876e081174 req-755ac24a-a7c3-4cb1-96cf-aaffe55316bf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Updating instance_info_cache with network_info: [{"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.307 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.309 187164 DEBUG oslo_concurrency.lockutils [req-21ea5feb-d870-4ffe-b6b4-c4876e081174 req-755ac24a-a7c3-4cb1-96cf-aaffe55316bf 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-769e4da5-19b5-4fc2-a50f-79ae93eaac71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.313 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.314 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.314 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.315 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.316 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.316 187164 DEBUG nova.virt.libvirt.driver [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.321 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.348 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.419 187164 INFO nova.compute.manager [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Took 7.71 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.420 187164 DEBUG nova.compute.manager [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.496 187164 INFO nova.compute.manager [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Took 8.12 seconds to build instance.#033[00m
Dec  5 08:10:05 np0005546954 nova_compute[187160]: 2025-12-05 13:10:05.511 187164 DEBUG oslo_concurrency.lockutils [None req-6bfcb0a3-c08b-4329-b9cd-fbe1c2e8d2e4 4dd51daacb354f619a7413f8b910294c 876dc65bd147462083d316c47ce85516 - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:05 np0005546954 podman[197513]: time="2025-12-05T13:10:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:10:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:10:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:10:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:10:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3041 "" "Go-http-client/1.1"
Dec  5 08:10:06 np0005546954 nova_compute[187160]: 2025-12-05 13:10:06.124 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:06 np0005546954 nova_compute[187160]: 2025-12-05 13:10:06.389 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:07 np0005546954 nova_compute[187160]: 2025-12-05 13:10:07.284 187164 DEBUG nova.compute.manager [req-6cbc0521-3c98-4ab0-aacc-c7b205e961fd req-bbc408fd-e6bc-4a46-9532-95e0cd0e891d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:10:07 np0005546954 nova_compute[187160]: 2025-12-05 13:10:07.285 187164 DEBUG oslo_concurrency.lockutils [req-6cbc0521-3c98-4ab0-aacc-c7b205e961fd req-bbc408fd-e6bc-4a46-9532-95e0cd0e891d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:07 np0005546954 nova_compute[187160]: 2025-12-05 13:10:07.285 187164 DEBUG oslo_concurrency.lockutils [req-6cbc0521-3c98-4ab0-aacc-c7b205e961fd req-bbc408fd-e6bc-4a46-9532-95e0cd0e891d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:07 np0005546954 nova_compute[187160]: 2025-12-05 13:10:07.286 187164 DEBUG oslo_concurrency.lockutils [req-6cbc0521-3c98-4ab0-aacc-c7b205e961fd req-bbc408fd-e6bc-4a46-9532-95e0cd0e891d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:07 np0005546954 nova_compute[187160]: 2025-12-05 13:10:07.286 187164 DEBUG nova.compute.manager [req-6cbc0521-3c98-4ab0-aacc-c7b205e961fd req-bbc408fd-e6bc-4a46-9532-95e0cd0e891d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] No waiting events found dispatching network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:10:07 np0005546954 nova_compute[187160]: 2025-12-05 13:10:07.287 187164 WARNING nova.compute.manager [req-6cbc0521-3c98-4ab0-aacc-c7b205e961fd req-bbc408fd-e6bc-4a46-9532-95e0cd0e891d 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received unexpected event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac for instance with vm_state active and task_state None.#033[00m
Dec  5 08:10:09 np0005546954 podman[219006]: 2025-12-05 13:10:09.580747559 +0000 UTC m=+0.079087751 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd)
Dec  5 08:10:09 np0005546954 podman[219005]: 2025-12-05 13:10:09.592846044 +0000 UTC m=+0.090988879 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec  5 08:10:11 np0005546954 nova_compute[187160]: 2025-12-05 13:10:11.127 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:11 np0005546954 nova_compute[187160]: 2025-12-05 13:10:11.392 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:16 np0005546954 nova_compute[187160]: 2025-12-05 13:10:16.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:10:16 np0005546954 nova_compute[187160]: 2025-12-05 13:10:16.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:10:16 np0005546954 nova_compute[187160]: 2025-12-05 13:10:16.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:10:16 np0005546954 nova_compute[187160]: 2025-12-05 13:10:16.130 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:16 np0005546954 nova_compute[187160]: 2025-12-05 13:10:16.391 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:16.975 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:16.975 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:16.976 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:17 np0005546954 nova_compute[187160]: 2025-12-05 13:10:17.047 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "refresh_cache-769e4da5-19b5-4fc2-a50f-79ae93eaac71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:10:17 np0005546954 nova_compute[187160]: 2025-12-05 13:10:17.047 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquired lock "refresh_cache-769e4da5-19b5-4fc2-a50f-79ae93eaac71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:10:17 np0005546954 nova_compute[187160]: 2025-12-05 13:10:17.047 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 08:10:17 np0005546954 nova_compute[187160]: 2025-12-05 13:10:17.047 187164 DEBUG nova.objects.instance [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 769e4da5-19b5-4fc2-a50f-79ae93eaac71 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:10:18 np0005546954 ovn_controller[95566]: 2025-12-05T13:10:18Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e4:68:0e 10.100.0.9
Dec  5 08:10:18 np0005546954 ovn_controller[95566]: 2025-12-05T13:10:18Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e4:68:0e 10.100.0.9
Dec  5 08:10:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:10:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:10:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:10:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:10:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:10:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:10:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:10:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:10:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:10:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:10:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:10:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:10:21 np0005546954 nova_compute[187160]: 2025-12-05 13:10:21.131 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:21 np0005546954 nova_compute[187160]: 2025-12-05 13:10:21.394 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:22 np0005546954 nova_compute[187160]: 2025-12-05 13:10:22.202 187164 DEBUG nova.network.neutron [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Updating instance_info_cache with network_info: [{"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:10:22 np0005546954 nova_compute[187160]: 2025-12-05 13:10:22.840 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Releasing lock "refresh_cache-769e4da5-19b5-4fc2-a50f-79ae93eaac71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:10:22 np0005546954 nova_compute[187160]: 2025-12-05 13:10:22.841 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 08:10:22 np0005546954 nova_compute[187160]: 2025-12-05 13:10:22.841 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:10:22 np0005546954 nova_compute[187160]: 2025-12-05 13:10:22.841 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:10:22 np0005546954 nova_compute[187160]: 2025-12-05 13:10:22.842 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:10:22 np0005546954 nova_compute[187160]: 2025-12-05 13:10:22.842 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:10:22 np0005546954 nova_compute[187160]: 2025-12-05 13:10:22.842 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:10:22 np0005546954 nova_compute[187160]: 2025-12-05 13:10:22.947 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:22 np0005546954 nova_compute[187160]: 2025-12-05 13:10:22.948 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:22 np0005546954 nova_compute[187160]: 2025-12-05 13:10:22.949 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:22 np0005546954 nova_compute[187160]: 2025-12-05 13:10:22.949 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:10:23 np0005546954 podman[219061]: 2025-12-05 13:10:23.136704689 +0000 UTC m=+0.117563893 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  5 08:10:23 np0005546954 nova_compute[187160]: 2025-12-05 13:10:23.450 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:10:23 np0005546954 nova_compute[187160]: 2025-12-05 13:10:23.543 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:10:23 np0005546954 nova_compute[187160]: 2025-12-05 13:10:23.544 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:10:23 np0005546954 nova_compute[187160]: 2025-12-05 13:10:23.631 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:10:23 np0005546954 nova_compute[187160]: 2025-12-05 13:10:23.852 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:10:23 np0005546954 nova_compute[187160]: 2025-12-05 13:10:23.854 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5662MB free_disk=73.3007583618164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:10:23 np0005546954 nova_compute[187160]: 2025-12-05 13:10:23.854 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:23 np0005546954 nova_compute[187160]: 2025-12-05 13:10:23.855 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:24 np0005546954 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  5 08:10:25 np0005546954 nova_compute[187160]: 2025-12-05 13:10:25.351 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 769e4da5-19b5-4fc2-a50f-79ae93eaac71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 08:10:25 np0005546954 nova_compute[187160]: 2025-12-05 13:10:25.352 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:10:25 np0005546954 nova_compute[187160]: 2025-12-05 13:10:25.353 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:10:25 np0005546954 nova_compute[187160]: 2025-12-05 13:10:25.488 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:10:26 np0005546954 nova_compute[187160]: 2025-12-05 13:10:26.135 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:26 np0005546954 nova_compute[187160]: 2025-12-05 13:10:26.214 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:10:26 np0005546954 nova_compute[187160]: 2025-12-05 13:10:26.319 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:10:26 np0005546954 nova_compute[187160]: 2025-12-05 13:10:26.320 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.465s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:26 np0005546954 nova_compute[187160]: 2025-12-05 13:10:26.321 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:10:26 np0005546954 nova_compute[187160]: 2025-12-05 13:10:26.322 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 08:10:26 np0005546954 nova_compute[187160]: 2025-12-05 13:10:26.364 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 08:10:26 np0005546954 nova_compute[187160]: 2025-12-05 13:10:26.364 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:10:26 np0005546954 nova_compute[187160]: 2025-12-05 13:10:26.396 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:27.557 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:10:27 np0005546954 nova_compute[187160]: 2025-12-05 13:10:27.558 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:27 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:27.559 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 08:10:29 np0005546954 podman[219090]: 2025-12-05 13:10:29.557245117 +0000 UTC m=+0.064218636 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:10:29 np0005546954 podman[219089]: 2025-12-05 13:10:29.573944436 +0000 UTC m=+0.090193202 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 08:10:29 np0005546954 nova_compute[187160]: 2025-12-05 13:10:29.667 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:10:29 np0005546954 nova_compute[187160]: 2025-12-05 13:10:29.668 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:10:29 np0005546954 nova_compute[187160]: 2025-12-05 13:10:29.668 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:10:29 np0005546954 nova_compute[187160]: 2025-12-05 13:10:29.668 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:10:31 np0005546954 nova_compute[187160]: 2025-12-05 13:10:31.138 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:31 np0005546954 nova_compute[187160]: 2025-12-05 13:10:31.398 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:32 np0005546954 nova_compute[187160]: 2025-12-05 13:10:32.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:10:32 np0005546954 nova_compute[187160]: 2025-12-05 13:10:32.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 08:10:35 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:35.562 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:10:35 np0005546954 podman[197513]: time="2025-12-05T13:10:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:10:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:10:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:10:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:10:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3058 "" "Go-http-client/1.1"
Dec  5 08:10:36 np0005546954 nova_compute[187160]: 2025-12-05 13:10:36.141 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:36 np0005546954 nova_compute[187160]: 2025-12-05 13:10:36.400 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:39 np0005546954 nova_compute[187160]: 2025-12-05 13:10:39.276 187164 DEBUG nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Check if temp file /var/lib/nova/instances/tmpc1lurrm1 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Dec  5 08:10:39 np0005546954 nova_compute[187160]: 2025-12-05 13:10:39.277 187164 DEBUG nova.compute.manager [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc1lurrm1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='769e4da5-19b5-4fc2-a50f-79ae93eaac71',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Dec  5 08:10:40 np0005546954 podman[219140]: 2025-12-05 13:10:40.537967853 +0000 UTC m=+0.052553853 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 08:10:40 np0005546954 podman[219139]: 2025-12-05 13:10:40.537919852 +0000 UTC m=+0.055643470 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec  5 08:10:40 np0005546954 nova_compute[187160]: 2025-12-05 13:10:40.663 187164 DEBUG oslo_concurrency.processutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:10:40 np0005546954 nova_compute[187160]: 2025-12-05 13:10:40.713 187164 DEBUG oslo_concurrency.processutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:10:40 np0005546954 nova_compute[187160]: 2025-12-05 13:10:40.714 187164 DEBUG oslo_concurrency.processutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:10:40 np0005546954 nova_compute[187160]: 2025-12-05 13:10:40.768 187164 DEBUG oslo_concurrency.processutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:10:41 np0005546954 nova_compute[187160]: 2025-12-05 13:10:41.143 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:41 np0005546954 nova_compute[187160]: 2025-12-05 13:10:41.402 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:42 np0005546954 systemd-logind[789]: New session 36 of user nova.
Dec  5 08:10:42 np0005546954 systemd[1]: Created slice User Slice of UID 42436.
Dec  5 08:10:42 np0005546954 systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec  5 08:10:42 np0005546954 systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec  5 08:10:42 np0005546954 systemd[1]: Starting User Manager for UID 42436...
Dec  5 08:10:43 np0005546954 systemd[219190]: Queued start job for default target Main User Target.
Dec  5 08:10:43 np0005546954 systemd[219190]: Created slice User Application Slice.
Dec  5 08:10:43 np0005546954 systemd[219190]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  5 08:10:43 np0005546954 systemd[219190]: Started Daily Cleanup of User's Temporary Directories.
Dec  5 08:10:43 np0005546954 systemd[219190]: Reached target Paths.
Dec  5 08:10:43 np0005546954 systemd[219190]: Reached target Timers.
Dec  5 08:10:43 np0005546954 systemd[219190]: Starting D-Bus User Message Bus Socket...
Dec  5 08:10:43 np0005546954 systemd[219190]: Starting Create User's Volatile Files and Directories...
Dec  5 08:10:43 np0005546954 systemd[219190]: Finished Create User's Volatile Files and Directories.
Dec  5 08:10:43 np0005546954 systemd[219190]: Listening on D-Bus User Message Bus Socket.
Dec  5 08:10:43 np0005546954 systemd[219190]: Reached target Sockets.
Dec  5 08:10:43 np0005546954 systemd[219190]: Reached target Basic System.
Dec  5 08:10:43 np0005546954 systemd[219190]: Reached target Main User Target.
Dec  5 08:10:43 np0005546954 systemd[219190]: Startup finished in 147ms.
Dec  5 08:10:43 np0005546954 systemd[1]: Started User Manager for UID 42436.
Dec  5 08:10:43 np0005546954 systemd[1]: Started Session 36 of User nova.
Dec  5 08:10:43 np0005546954 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 08:10:43 np0005546954 systemd[1]: session-36.scope: Deactivated successfully.
Dec  5 08:10:43 np0005546954 systemd-logind[789]: Session 36 logged out. Waiting for processes to exit.
Dec  5 08:10:43 np0005546954 systemd-logind[789]: Removed session 36.
Dec  5 08:10:43 np0005546954 nova_compute[187160]: 2025-12-05 13:10:43.660 187164 DEBUG nova.compute.manager [req-d3359379-6d6a-46de-b73c-77d774ba80ed req-e48f55db-eee6-4500-a1b3-3d95b37d3aa6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-unplugged-bf6c2a48-999b-4820-9631-dd3dfff5caac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:10:43 np0005546954 nova_compute[187160]: 2025-12-05 13:10:43.660 187164 DEBUG oslo_concurrency.lockutils [req-d3359379-6d6a-46de-b73c-77d774ba80ed req-e48f55db-eee6-4500-a1b3-3d95b37d3aa6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:43 np0005546954 nova_compute[187160]: 2025-12-05 13:10:43.660 187164 DEBUG oslo_concurrency.lockutils [req-d3359379-6d6a-46de-b73c-77d774ba80ed req-e48f55db-eee6-4500-a1b3-3d95b37d3aa6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:43 np0005546954 nova_compute[187160]: 2025-12-05 13:10:43.661 187164 DEBUG oslo_concurrency.lockutils [req-d3359379-6d6a-46de-b73c-77d774ba80ed req-e48f55db-eee6-4500-a1b3-3d95b37d3aa6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:43 np0005546954 nova_compute[187160]: 2025-12-05 13:10:43.661 187164 DEBUG nova.compute.manager [req-d3359379-6d6a-46de-b73c-77d774ba80ed req-e48f55db-eee6-4500-a1b3-3d95b37d3aa6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] No waiting events found dispatching network-vif-unplugged-bf6c2a48-999b-4820-9631-dd3dfff5caac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:10:43 np0005546954 nova_compute[187160]: 2025-12-05 13:10:43.661 187164 DEBUG nova.compute.manager [req-d3359379-6d6a-46de-b73c-77d774ba80ed req-e48f55db-eee6-4500-a1b3-3d95b37d3aa6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-unplugged-bf6c2a48-999b-4820-9631-dd3dfff5caac for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 08:10:44 np0005546954 nova_compute[187160]: 2025-12-05 13:10:44.612 187164 INFO nova.compute.manager [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Took 3.84 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Dec  5 08:10:44 np0005546954 nova_compute[187160]: 2025-12-05 13:10:44.613 187164 DEBUG nova.compute.manager [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 08:10:44 np0005546954 nova_compute[187160]: 2025-12-05 13:10:44.633 187164 DEBUG nova.compute.manager [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc1lurrm1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='769e4da5-19b5-4fc2-a50f-79ae93eaac71',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(9189c7d2-a0ea-4906-b593-db7025a8d440),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Dec  5 08:10:44 np0005546954 nova_compute[187160]: 2025-12-05 13:10:44.656 187164 DEBUG nova.objects.instance [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lazy-loading 'migration_context' on Instance uuid 769e4da5-19b5-4fc2-a50f-79ae93eaac71 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:10:44 np0005546954 nova_compute[187160]: 2025-12-05 13:10:44.657 187164 DEBUG nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Dec  5 08:10:44 np0005546954 nova_compute[187160]: 2025-12-05 13:10:44.659 187164 DEBUG nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Dec  5 08:10:44 np0005546954 nova_compute[187160]: 2025-12-05 13:10:44.659 187164 DEBUG nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Dec  5 08:10:44 np0005546954 nova_compute[187160]: 2025-12-05 13:10:44.672 187164 DEBUG nova.virt.libvirt.vif [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T13:09:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1045316080',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1045316080',id=27,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:10:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='876dc65bd147462083d316c47ce85516',ramdisk_id='',reservation_id='r-1t1mv4oh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',imag
e_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-989128504',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-989128504-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T13:10:05Z,user_data=None,user_id='4dd51daacb354f619a7413f8b910294c',uuid=769e4da5-19b5-4fc2-a50f-79ae93eaac71,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 08:10:44 np0005546954 nova_compute[187160]: 2025-12-05 13:10:44.673 187164 DEBUG nova.network.os_vif_util [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Converting VIF {"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:10:44 np0005546954 nova_compute[187160]: 2025-12-05 13:10:44.673 187164 DEBUG nova.network.os_vif_util [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:68:0e,bridge_name='br-int',has_traffic_filtering=True,id=bf6c2a48-999b-4820-9631-dd3dfff5caac,network=Network(b5e62496-3107-4a7c-9c99-52197cb36a02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf6c2a48-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:10:44 np0005546954 nova_compute[187160]: 2025-12-05 13:10:44.674 187164 DEBUG nova.virt.libvirt.migration [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Updating guest XML with vif config: <interface type="ethernet">
Dec  5 08:10:44 np0005546954 nova_compute[187160]:  <mac address="fa:16:3e:e4:68:0e"/>
Dec  5 08:10:44 np0005546954 nova_compute[187160]:  <model type="virtio"/>
Dec  5 08:10:44 np0005546954 nova_compute[187160]:  <driver name="vhost" rx_queue_size="512"/>
Dec  5 08:10:44 np0005546954 nova_compute[187160]:  <mtu size="1442"/>
Dec  5 08:10:44 np0005546954 nova_compute[187160]:  <target dev="tapbf6c2a48-99"/>
Dec  5 08:10:44 np0005546954 nova_compute[187160]: </interface>
Dec  5 08:10:44 np0005546954 nova_compute[187160]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Dec  5 08:10:44 np0005546954 nova_compute[187160]: 2025-12-05 13:10:44.674 187164 DEBUG nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.161 187164 DEBUG nova.virt.libvirt.migration [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.162 187164 INFO nova.virt.libvirt.migration [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.263 187164 INFO nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.767 187164 DEBUG nova.compute.manager [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.767 187164 DEBUG oslo_concurrency.lockutils [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.767 187164 DEBUG oslo_concurrency.lockutils [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.768 187164 DEBUG oslo_concurrency.lockutils [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.768 187164 DEBUG nova.compute.manager [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] No waiting events found dispatching network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.768 187164 WARNING nova.compute.manager [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received unexpected event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac for instance with vm_state active and task_state migrating.#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.769 187164 DEBUG nova.compute.manager [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-changed-bf6c2a48-999b-4820-9631-dd3dfff5caac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.769 187164 DEBUG nova.compute.manager [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Refreshing instance network info cache due to event network-changed-bf6c2a48-999b-4820-9631-dd3dfff5caac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.769 187164 DEBUG oslo_concurrency.lockutils [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-769e4da5-19b5-4fc2-a50f-79ae93eaac71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.770 187164 DEBUG oslo_concurrency.lockutils [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-769e4da5-19b5-4fc2-a50f-79ae93eaac71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.770 187164 DEBUG nova.network.neutron [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Refreshing network info cache for port bf6c2a48-999b-4820-9631-dd3dfff5caac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.772 187164 DEBUG nova.virt.libvirt.migration [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  5 08:10:45 np0005546954 nova_compute[187160]: 2025-12-05 13:10:45.772 187164 DEBUG nova.virt.libvirt.migration [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  5 08:10:46 np0005546954 nova_compute[187160]: 2025-12-05 13:10:46.146 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:46 np0005546954 nova_compute[187160]: 2025-12-05 13:10:46.276 187164 DEBUG nova.virt.libvirt.migration [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  5 08:10:46 np0005546954 nova_compute[187160]: 2025-12-05 13:10:46.276 187164 DEBUG nova.virt.libvirt.migration [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  5 08:10:46 np0005546954 nova_compute[187160]: 2025-12-05 13:10:46.404 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:46 np0005546954 nova_compute[187160]: 2025-12-05 13:10:46.788 187164 DEBUG nova.virt.libvirt.migration [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  5 08:10:46 np0005546954 nova_compute[187160]: 2025-12-05 13:10:46.789 187164 DEBUG nova.virt.libvirt.migration [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  5 08:10:46 np0005546954 nova_compute[187160]: 2025-12-05 13:10:46.789 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764940246.787084, 769e4da5-19b5-4fc2-a50f-79ae93eaac71 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:10:46 np0005546954 nova_compute[187160]: 2025-12-05 13:10:46.789 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] VM Paused (Lifecycle Event)#033[00m
Dec  5 08:10:46 np0005546954 nova_compute[187160]: 2025-12-05 13:10:46.806 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.019 187164 DEBUG nova.network.neutron [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Updated VIF entry in instance network info cache for port bf6c2a48-999b-4820-9631-dd3dfff5caac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.019 187164 DEBUG nova.network.neutron [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Updating instance_info_cache with network_info: [{"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.043 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.046 187164 DEBUG oslo_concurrency.lockutils [req-c0cf5c07-d288-4006-8e01-6af740ae04cb req-d259ee89-55c7-45d7-a497-8a0327270bd8 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-769e4da5-19b5-4fc2-a50f-79ae93eaac71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.064 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.293 187164 DEBUG nova.virt.libvirt.migration [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.293 187164 DEBUG nova.virt.libvirt.migration [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  5 08:10:47 np0005546954 kernel: tapbf6c2a48-99 (unregistering): left promiscuous mode
Dec  5 08:10:47 np0005546954 NetworkManager[55665]: <info>  [1764940247.3388] device (tapbf6c2a48-99): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.351 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:47 np0005546954 ovn_controller[95566]: 2025-12-05T13:10:47Z|00260|binding|INFO|Releasing lport bf6c2a48-999b-4820-9631-dd3dfff5caac from this chassis (sb_readonly=0)
Dec  5 08:10:47 np0005546954 ovn_controller[95566]: 2025-12-05T13:10:47Z|00261|binding|INFO|Setting lport bf6c2a48-999b-4820-9631-dd3dfff5caac down in Southbound
Dec  5 08:10:47 np0005546954 ovn_controller[95566]: 2025-12-05T13:10:47Z|00262|binding|INFO|Removing iface tapbf6c2a48-99 ovn-installed in OVS
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.384 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:68:0e 10.100.0.9'], port_security=['fa:16:3e:e4:68:0e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b049cde7-59fd-4961-9791-d49d79184b2c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '769e4da5-19b5-4fc2-a50f-79ae93eaac71', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5e62496-3107-4a7c-9c99-52197cb36a02', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '876dc65bd147462083d316c47ce85516', 'neutron:revision_number': '8', 'neutron:security_group_ids': '77b331ae-8f43-48fd-990f-4e89bc9baf9e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7fb8b84-3250-4a16-88c8-3557896a7df7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=bf6c2a48-999b-4820-9631-dd3dfff5caac) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.385 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.386 104428 INFO neutron.agent.ovn.metadata.agent [-] Port bf6c2a48-999b-4820-9631-dd3dfff5caac in datapath b5e62496-3107-4a7c-9c99-52197cb36a02 unbound from our chassis#033[00m
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.387 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5e62496-3107-4a7c-9c99-52197cb36a02, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.389 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c9d965-d781-4e18-a415-83b1f1b52527]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.389 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02 namespace which is not needed anymore#033[00m
Dec  5 08:10:47 np0005546954 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Dec  5 08:10:47 np0005546954 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000001b.scope: Consumed 14.379s CPU time.
Dec  5 08:10:47 np0005546954 systemd-machined[153497]: Machine qemu-25-instance-0000001b terminated.
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.579 187164 DEBUG nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.579 187164 DEBUG nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.579 187164 DEBUG nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Dec  5 08:10:47 np0005546954 neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02[218990]: [NOTICE]   (218994) : haproxy version is 2.8.14-c23fe91
Dec  5 08:10:47 np0005546954 neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02[218990]: [NOTICE]   (218994) : path to executable is /usr/sbin/haproxy
Dec  5 08:10:47 np0005546954 neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02[218990]: [WARNING]  (218994) : Exiting Master process...
Dec  5 08:10:47 np0005546954 neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02[218990]: [ALERT]    (218994) : Current worker (218996) exited with code 143 (Terminated)
Dec  5 08:10:47 np0005546954 neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02[218990]: [WARNING]  (218994) : All workers exited. Exiting... (0)
Dec  5 08:10:47 np0005546954 systemd[1]: libpod-c1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c.scope: Deactivated successfully.
Dec  5 08:10:47 np0005546954 podman[219241]: 2025-12-05 13:10:47.633366797 +0000 UTC m=+0.124434877 container died c1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 08:10:47 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c-userdata-shm.mount: Deactivated successfully.
Dec  5 08:10:47 np0005546954 systemd[1]: var-lib-containers-storage-overlay-6bf5fc01373642dacad4b299d861a61b75470067c2b7fbfe4361855db2fa0b4d-merged.mount: Deactivated successfully.
Dec  5 08:10:47 np0005546954 podman[219241]: 2025-12-05 13:10:47.680013745 +0000 UTC m=+0.171081815 container cleanup c1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 08:10:47 np0005546954 systemd[1]: libpod-conmon-c1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c.scope: Deactivated successfully.
Dec  5 08:10:47 np0005546954 podman[219288]: 2025-12-05 13:10:47.76223915 +0000 UTC m=+0.051299205 container remove c1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.769 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[b081a457-47a4-42dd-8db3-0eb6590a1dab]: (4, ('Fri Dec  5 01:10:47 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02 (c1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c)\nc1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c\nFri Dec  5 01:10:47 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02 (c1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c)\nc1369fcc9898edb2eac91f527f40bbfbd33979cce93acc06ab4191f6e4b1991c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.771 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[7da73c37-f413-4f5d-9aa6-73a68f00a983]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.772 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5e62496-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.796 187164 DEBUG nova.virt.libvirt.guest [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '769e4da5-19b5-4fc2-a50f-79ae93eaac71' (instance-0000001b) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.796 187164 INFO nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Migration operation has completed#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.796 187164 INFO nova.compute.manager [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] _post_live_migration() is started..#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.823 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:47 np0005546954 kernel: tapb5e62496-30: left promiscuous mode
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.846 187164 DEBUG nova.compute.manager [req-245a4c2d-7845-40ed-86c1-b172915068e0 req-d26f4ffa-eff1-40e5-9bcd-8f3f54335b24 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-unplugged-bf6c2a48-999b-4820-9631-dd3dfff5caac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.846 187164 DEBUG oslo_concurrency.lockutils [req-245a4c2d-7845-40ed-86c1-b172915068e0 req-d26f4ffa-eff1-40e5-9bcd-8f3f54335b24 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.847 187164 DEBUG oslo_concurrency.lockutils [req-245a4c2d-7845-40ed-86c1-b172915068e0 req-d26f4ffa-eff1-40e5-9bcd-8f3f54335b24 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.847 187164 DEBUG oslo_concurrency.lockutils [req-245a4c2d-7845-40ed-86c1-b172915068e0 req-d26f4ffa-eff1-40e5-9bcd-8f3f54335b24 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.847 187164 DEBUG nova.compute.manager [req-245a4c2d-7845-40ed-86c1-b172915068e0 req-d26f4ffa-eff1-40e5-9bcd-8f3f54335b24 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] No waiting events found dispatching network-vif-unplugged-bf6c2a48-999b-4820-9631-dd3dfff5caac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.847 187164 DEBUG nova.compute.manager [req-245a4c2d-7845-40ed-86c1-b172915068e0 req-d26f4ffa-eff1-40e5-9bcd-8f3f54335b24 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-unplugged-bf6c2a48-999b-4820-9631-dd3dfff5caac for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 08:10:47 np0005546954 nova_compute[187160]: 2025-12-05 13:10:47.847 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.849 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[bf33c4ef-376d-4d20-a98f-a697678d1460]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.869 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[21dbbb99-2fa1-4c16-b494-db5279abca7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.871 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d77e462c-2a0e-4230-9c1c-3d0850f6071a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.889 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[9b23b02e-c747-41c9-8ef9-51a8d42f3c24]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520993, 'reachable_time': 43256, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219307, 'error': None, 'target': 'ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.893 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b5e62496-3107-4a7c-9c99-52197cb36a02 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 08:10:47 np0005546954 systemd[1]: run-netns-ovnmeta\x2db5e62496\x2d3107\x2d4a7c\x2d9c99\x2d52197cb36a02.mount: Deactivated successfully.
Dec  5 08:10:47 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:10:47.893 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[cc40c467-a41d-4a1e-b815-8b024024ecd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:10:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:10:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:10:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:10:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:10:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:10:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:10:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:10:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:10:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:10:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.595 187164 DEBUG nova.network.neutron [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Activated binding for port bf6c2a48-999b-4820-9631-dd3dfff5caac and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.595 187164 DEBUG nova.compute.manager [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.596 187164 DEBUG nova.virt.libvirt.vif [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T13:09:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1045316080',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1045316080',id=27,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:10:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='876dc65bd147462083d316c47ce85516',ramdisk_id='',reservation_id='r-1t1mv4oh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-989128504',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-989128504-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T13:10:33Z,user_data=None,user_id='4dd51daacb354f619a7413f8b910294c',uuid=769e4da5-19b5-4fc2-a50f-79ae93eaac71,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.597 187164 DEBUG nova.network.os_vif_util [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Converting VIF {"id": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "address": "fa:16:3e:e4:68:0e", "network": {"id": "b5e62496-3107-4a7c-9c99-52197cb36a02", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2065334195-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "876dc65bd147462083d316c47ce85516", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf6c2a48-99", "ovs_interfaceid": "bf6c2a48-999b-4820-9631-dd3dfff5caac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.598 187164 DEBUG nova.network.os_vif_util [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:68:0e,bridge_name='br-int',has_traffic_filtering=True,id=bf6c2a48-999b-4820-9631-dd3dfff5caac,network=Network(b5e62496-3107-4a7c-9c99-52197cb36a02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf6c2a48-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.598 187164 DEBUG os_vif [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:68:0e,bridge_name='br-int',has_traffic_filtering=True,id=bf6c2a48-999b-4820-9631-dd3dfff5caac,network=Network(b5e62496-3107-4a7c-9c99-52197cb36a02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf6c2a48-99') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.601 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.601 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbf6c2a48-99, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.603 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.604 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.608 187164 INFO os_vif [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:68:0e,bridge_name='br-int',has_traffic_filtering=True,id=bf6c2a48-999b-4820-9631-dd3dfff5caac,network=Network(b5e62496-3107-4a7c-9c99-52197cb36a02),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf6c2a48-99')#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.609 187164 DEBUG oslo_concurrency.lockutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.609 187164 DEBUG oslo_concurrency.lockutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.610 187164 DEBUG oslo_concurrency.lockutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.610 187164 DEBUG nova.compute.manager [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.611 187164 INFO nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Deleting instance files /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71_del#033[00m
Dec  5 08:10:49 np0005546954 nova_compute[187160]: 2025-12-05 13:10:49.612 187164 INFO nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Deletion of /var/lib/nova/instances/769e4da5-19b5-4fc2-a50f-79ae93eaac71_del complete#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.930 187164 DEBUG nova.compute.manager [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.930 187164 DEBUG oslo_concurrency.lockutils [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.931 187164 DEBUG oslo_concurrency.lockutils [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.931 187164 DEBUG oslo_concurrency.lockutils [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.931 187164 DEBUG nova.compute.manager [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] No waiting events found dispatching network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.931 187164 WARNING nova.compute.manager [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received unexpected event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac for instance with vm_state active and task_state migrating.#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.931 187164 DEBUG nova.compute.manager [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-unplugged-bf6c2a48-999b-4820-9631-dd3dfff5caac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.932 187164 DEBUG oslo_concurrency.lockutils [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.932 187164 DEBUG oslo_concurrency.lockutils [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.932 187164 DEBUG oslo_concurrency.lockutils [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.932 187164 DEBUG nova.compute.manager [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] No waiting events found dispatching network-vif-unplugged-bf6c2a48-999b-4820-9631-dd3dfff5caac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.933 187164 DEBUG nova.compute.manager [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-unplugged-bf6c2a48-999b-4820-9631-dd3dfff5caac for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.933 187164 DEBUG nova.compute.manager [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.933 187164 DEBUG oslo_concurrency.lockutils [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.933 187164 DEBUG oslo_concurrency.lockutils [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.933 187164 DEBUG oslo_concurrency.lockutils [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.934 187164 DEBUG nova.compute.manager [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] No waiting events found dispatching network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:10:50 np0005546954 nova_compute[187160]: 2025-12-05 13:10:50.934 187164 WARNING nova.compute.manager [req-970b2676-4437-4d90-8188-c67a99d32656 req-22f90dde-5cfc-428a-97cb-052a9aa5b94c 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received unexpected event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac for instance with vm_state active and task_state migrating.#033[00m
Dec  5 08:10:51 np0005546954 nova_compute[187160]: 2025-12-05 13:10:51.406 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:53 np0005546954 nova_compute[187160]: 2025-12-05 13:10:53.114 187164 DEBUG nova.compute.manager [req-2d66a95b-ead8-45c1-a680-245e816bfeea req-6811d9e0-7d9a-4fb6-ac03-d69f26166ff3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:10:53 np0005546954 nova_compute[187160]: 2025-12-05 13:10:53.115 187164 DEBUG oslo_concurrency.lockutils [req-2d66a95b-ead8-45c1-a680-245e816bfeea req-6811d9e0-7d9a-4fb6-ac03-d69f26166ff3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:53 np0005546954 nova_compute[187160]: 2025-12-05 13:10:53.115 187164 DEBUG oslo_concurrency.lockutils [req-2d66a95b-ead8-45c1-a680-245e816bfeea req-6811d9e0-7d9a-4fb6-ac03-d69f26166ff3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:53 np0005546954 nova_compute[187160]: 2025-12-05 13:10:53.116 187164 DEBUG oslo_concurrency.lockutils [req-2d66a95b-ead8-45c1-a680-245e816bfeea req-6811d9e0-7d9a-4fb6-ac03-d69f26166ff3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:53 np0005546954 nova_compute[187160]: 2025-12-05 13:10:53.116 187164 DEBUG nova.compute.manager [req-2d66a95b-ead8-45c1-a680-245e816bfeea req-6811d9e0-7d9a-4fb6-ac03-d69f26166ff3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] No waiting events found dispatching network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:10:53 np0005546954 nova_compute[187160]: 2025-12-05 13:10:53.116 187164 WARNING nova.compute.manager [req-2d66a95b-ead8-45c1-a680-245e816bfeea req-6811d9e0-7d9a-4fb6-ac03-d69f26166ff3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received unexpected event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac for instance with vm_state active and task_state migrating.#033[00m
Dec  5 08:10:53 np0005546954 nova_compute[187160]: 2025-12-05 13:10:53.117 187164 DEBUG nova.compute.manager [req-2d66a95b-ead8-45c1-a680-245e816bfeea req-6811d9e0-7d9a-4fb6-ac03-d69f26166ff3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:10:53 np0005546954 nova_compute[187160]: 2025-12-05 13:10:53.117 187164 DEBUG oslo_concurrency.lockutils [req-2d66a95b-ead8-45c1-a680-245e816bfeea req-6811d9e0-7d9a-4fb6-ac03-d69f26166ff3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:53 np0005546954 nova_compute[187160]: 2025-12-05 13:10:53.118 187164 DEBUG oslo_concurrency.lockutils [req-2d66a95b-ead8-45c1-a680-245e816bfeea req-6811d9e0-7d9a-4fb6-ac03-d69f26166ff3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:53 np0005546954 nova_compute[187160]: 2025-12-05 13:10:53.118 187164 DEBUG oslo_concurrency.lockutils [req-2d66a95b-ead8-45c1-a680-245e816bfeea req-6811d9e0-7d9a-4fb6-ac03-d69f26166ff3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:53 np0005546954 nova_compute[187160]: 2025-12-05 13:10:53.119 187164 DEBUG nova.compute.manager [req-2d66a95b-ead8-45c1-a680-245e816bfeea req-6811d9e0-7d9a-4fb6-ac03-d69f26166ff3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] No waiting events found dispatching network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:10:53 np0005546954 nova_compute[187160]: 2025-12-05 13:10:53.119 187164 WARNING nova.compute.manager [req-2d66a95b-ead8-45c1-a680-245e816bfeea req-6811d9e0-7d9a-4fb6-ac03-d69f26166ff3 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Received unexpected event network-vif-plugged-bf6c2a48-999b-4820-9631-dd3dfff5caac for instance with vm_state active and task_state migrating.#033[00m
Dec  5 08:10:53 np0005546954 systemd[1]: Stopping User Manager for UID 42436...
Dec  5 08:10:53 np0005546954 systemd[219190]: Activating special unit Exit the Session...
Dec  5 08:10:53 np0005546954 systemd[219190]: Stopped target Main User Target.
Dec  5 08:10:53 np0005546954 systemd[219190]: Stopped target Basic System.
Dec  5 08:10:53 np0005546954 systemd[219190]: Stopped target Paths.
Dec  5 08:10:53 np0005546954 systemd[219190]: Stopped target Sockets.
Dec  5 08:10:53 np0005546954 systemd[219190]: Stopped target Timers.
Dec  5 08:10:53 np0005546954 systemd[219190]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec  5 08:10:53 np0005546954 systemd[219190]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  5 08:10:53 np0005546954 systemd[219190]: Closed D-Bus User Message Bus Socket.
Dec  5 08:10:53 np0005546954 systemd[219190]: Stopped Create User's Volatile Files and Directories.
Dec  5 08:10:53 np0005546954 systemd[219190]: Removed slice User Application Slice.
Dec  5 08:10:53 np0005546954 systemd[219190]: Reached target Shutdown.
Dec  5 08:10:53 np0005546954 systemd[219190]: Finished Exit the Session.
Dec  5 08:10:53 np0005546954 systemd[219190]: Reached target Exit the Session.
Dec  5 08:10:53 np0005546954 systemd[1]: user@42436.service: Deactivated successfully.
Dec  5 08:10:53 np0005546954 systemd[1]: Stopped User Manager for UID 42436.
Dec  5 08:10:53 np0005546954 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec  5 08:10:53 np0005546954 systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec  5 08:10:53 np0005546954 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec  5 08:10:53 np0005546954 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec  5 08:10:53 np0005546954 systemd[1]: Removed slice User Slice of UID 42436.
Dec  5 08:10:53 np0005546954 podman[219310]: 2025-12-05 13:10:53.581205241 +0000 UTC m=+0.075754145 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  5 08:10:54 np0005546954 nova_compute[187160]: 2025-12-05 13:10:54.603 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.154 187164 DEBUG oslo_concurrency.lockutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Acquiring lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.155 187164 DEBUG oslo_concurrency.lockutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.156 187164 DEBUG oslo_concurrency.lockutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "769e4da5-19b5-4fc2-a50f-79ae93eaac71-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.181 187164 DEBUG oslo_concurrency.lockutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.182 187164 DEBUG oslo_concurrency.lockutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.182 187164 DEBUG oslo_concurrency.lockutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.183 187164 DEBUG nova.compute.resource_tracker [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.360 187164 WARNING nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.361 187164 DEBUG nova.compute.resource_tracker [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5858MB free_disk=73.32942199707031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": 
"0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.361 187164 DEBUG oslo_concurrency.lockutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.362 187164 DEBUG oslo_concurrency.lockutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.407 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:10:56 np0005546954 nova_compute[187160]: 2025-12-05 13:10:56.704 187164 DEBUG nova.compute.resource_tracker [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Migration for instance 769e4da5-19b5-4fc2-a50f-79ae93eaac71 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  5 08:10:57 np0005546954 nova_compute[187160]: 2025-12-05 13:10:57.049 187164 DEBUG nova.compute.resource_tracker [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Dec  5 08:10:57 np0005546954 nova_compute[187160]: 2025-12-05 13:10:57.093 187164 DEBUG nova.compute.resource_tracker [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Migration 9189c7d2-a0ea-4906-b593-db7025a8d440 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Dec  5 08:10:57 np0005546954 nova_compute[187160]: 2025-12-05 13:10:57.094 187164 DEBUG nova.compute.resource_tracker [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:10:57 np0005546954 nova_compute[187160]: 2025-12-05 13:10:57.094 187164 DEBUG nova.compute.resource_tracker [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:10:57 np0005546954 nova_compute[187160]: 2025-12-05 13:10:57.182 187164 DEBUG nova.compute.provider_tree [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:10:57 np0005546954 nova_compute[187160]: 2025-12-05 13:10:57.538 187164 DEBUG nova.scheduler.client.report [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:10:59 np0005546954 nova_compute[187160]: 2025-12-05 13:10:59.395 187164 DEBUG nova.compute.resource_tracker [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:10:59 np0005546954 nova_compute[187160]: 2025-12-05 13:10:59.396 187164 DEBUG oslo_concurrency.lockutils [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:10:59 np0005546954 nova_compute[187160]: 2025-12-05 13:10:59.401 187164 INFO nova.compute.manager [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Dec  5 08:10:59 np0005546954 nova_compute[187160]: 2025-12-05 13:10:59.606 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:00 np0005546954 podman[219334]: 2025-12-05 13:11:00.563893253 +0000 UTC m=+0.063174284 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:11:00 np0005546954 podman[219333]: 2025-12-05 13:11:00.586889708 +0000 UTC m=+0.092934139 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 08:11:01 np0005546954 nova_compute[187160]: 2025-12-05 13:11:01.457 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:01 np0005546954 nova_compute[187160]: 2025-12-05 13:11:01.562 187164 INFO nova.scheduler.client.report [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Deleted allocation for migration 9189c7d2-a0ea-4906-b593-db7025a8d440#033[00m
Dec  5 08:11:01 np0005546954 nova_compute[187160]: 2025-12-05 13:11:01.563 187164 DEBUG nova.virt.libvirt.driver [None req-5ab51508-6404-4df1-b0b7-ba7cbc5c3aec 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Dec  5 08:11:02 np0005546954 nova_compute[187160]: 2025-12-05 13:11:02.579 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764940247.5706596, 769e4da5-19b5-4fc2-a50f-79ae93eaac71 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:11:02 np0005546954 nova_compute[187160]: 2025-12-05 13:11:02.580 187164 INFO nova.compute.manager [-] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] VM Stopped (Lifecycle Event)#033[00m
Dec  5 08:11:02 np0005546954 nova_compute[187160]: 2025-12-05 13:11:02.682 187164 DEBUG nova.compute.manager [None req-eb8f87a1-ad45-4445-b958-6a58ecf9a920 - - - - - -] [instance: 769e4da5-19b5-4fc2-a50f-79ae93eaac71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:11:04 np0005546954 nova_compute[187160]: 2025-12-05 13:11:04.609 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:05 np0005546954 podman[197513]: time="2025-12-05T13:11:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:11:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:11:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:11:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:11:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Dec  5 08:11:06 np0005546954 nova_compute[187160]: 2025-12-05 13:11:06.461 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:09 np0005546954 nova_compute[187160]: 2025-12-05 13:11:09.612 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:11 np0005546954 nova_compute[187160]: 2025-12-05 13:11:11.462 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:11 np0005546954 podman[219385]: 2025-12-05 13:11:11.543067211 +0000 UTC m=+0.053443772 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Dec  5 08:11:11 np0005546954 podman[219386]: 2025-12-05 13:11:11.548094917 +0000 UTC m=+0.055019170 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=multipathd)
Dec  5 08:11:14 np0005546954 nova_compute[187160]: 2025-12-05 13:11:14.615 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:16 np0005546954 nova_compute[187160]: 2025-12-05 13:11:16.419 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:11:16 np0005546954 nova_compute[187160]: 2025-12-05 13:11:16.420 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:11:16 np0005546954 nova_compute[187160]: 2025-12-05 13:11:16.420 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:11:16 np0005546954 nova_compute[187160]: 2025-12-05 13:11:16.439 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:11:16 np0005546954 nova_compute[187160]: 2025-12-05 13:11:16.463 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:11:16.976 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:11:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:11:16.977 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:11:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:11:16.977 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:11:19 np0005546954 nova_compute[187160]: 2025-12-05 13:11:19.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:11:19 np0005546954 nova_compute[187160]: 2025-12-05 13:11:19.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:11:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:11:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:11:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:11:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:11:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:11:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:11:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:11:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:11:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:11:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:11:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:11:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:11:19 np0005546954 nova_compute[187160]: 2025-12-05 13:11:19.617 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:20 np0005546954 nova_compute[187160]: 2025-12-05 13:11:20.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:11:21 np0005546954 nova_compute[187160]: 2025-12-05 13:11:21.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:11:21 np0005546954 nova_compute[187160]: 2025-12-05 13:11:21.463 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:22 np0005546954 nova_compute[187160]: 2025-12-05 13:11:22.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:11:22 np0005546954 nova_compute[187160]: 2025-12-05 13:11:22.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:11:24 np0005546954 podman[219426]: 2025-12-05 13:11:24.556340716 +0000 UTC m=+0.067374244 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.563 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.563 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.564 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.564 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.619 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.734 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.736 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5866MB free_disk=73.32938385009766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.736 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.736 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.841 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.842 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.873 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.887 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.888 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:11:24 np0005546954 nova_compute[187160]: 2025-12-05 13:11:24.889 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:11:26 np0005546954 nova_compute[187160]: 2025-12-05 13:11:26.500 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:29 np0005546954 nova_compute[187160]: 2025-12-05 13:11:29.622 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:30 np0005546954 nova_compute[187160]: 2025-12-05 13:11:30.889 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:11:30 np0005546954 nova_compute[187160]: 2025-12-05 13:11:30.890 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:11:31 np0005546954 nova_compute[187160]: 2025-12-05 13:11:31.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:11:31 np0005546954 nova_compute[187160]: 2025-12-05 13:11:31.082 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:11:31 np0005546954 nova_compute[187160]: 2025-12-05 13:11:31.503 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:31 np0005546954 podman[219447]: 2025-12-05 13:11:31.559076721 +0000 UTC m=+0.054638628 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 08:11:31 np0005546954 podman[219446]: 2025-12-05 13:11:31.63437796 +0000 UTC m=+0.135883612 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec  5 08:11:34 np0005546954 nova_compute[187160]: 2025-12-05 13:11:34.625 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:35 np0005546954 podman[197513]: time="2025-12-05T13:11:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:11:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:11:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:11:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:11:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Dec  5 08:11:36 np0005546954 nova_compute[187160]: 2025-12-05 13:11:36.504 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:39 np0005546954 nova_compute[187160]: 2025-12-05 13:11:39.627 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:41 np0005546954 ovn_controller[95566]: 2025-12-05T13:11:41Z|00263|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Dec  5 08:11:41 np0005546954 nova_compute[187160]: 2025-12-05 13:11:41.505 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:42 np0005546954 podman[219499]: 2025-12-05 13:11:42.561297407 +0000 UTC m=+0.064323010 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:11:42 np0005546954 podman[219498]: 2025-12-05 13:11:42.598261795 +0000 UTC m=+0.104889189 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, 
architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.6, distribution-scope=public)
Dec  5 08:11:44 np0005546954 nova_compute[187160]: 2025-12-05 13:11:44.630 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:46 np0005546954 nova_compute[187160]: 2025-12-05 13:11:46.519 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:11:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:11:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:11:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:11:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:11:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:11:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:11:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:11:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:11:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:11:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:11:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:11:49 np0005546954 nova_compute[187160]: 2025-12-05 13:11:49.632 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:51 np0005546954 nova_compute[187160]: 2025-12-05 13:11:51.522 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:54 np0005546954 nova_compute[187160]: 2025-12-05 13:11:54.636 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:55 np0005546954 podman[219540]: 2025-12-05 13:11:55.547195583 +0000 UTC m=+0.060143060 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  5 08:11:56 np0005546954 nova_compute[187160]: 2025-12-05 13:11:56.522 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:11:59 np0005546954 nova_compute[187160]: 2025-12-05 13:11:59.638 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:12:01.382 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:12:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:12:01.383 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 08:12:01 np0005546954 nova_compute[187160]: 2025-12-05 13:12:01.383 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:01 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:12:01.384 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:12:01 np0005546954 nova_compute[187160]: 2025-12-05 13:12:01.524 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:02 np0005546954 podman[219560]: 2025-12-05 13:12:02.580234089 +0000 UTC m=+0.082240616 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:12:02 np0005546954 podman[219559]: 2025-12-05 13:12:02.652070521 +0000 UTC m=+0.159308450 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  5 08:12:04 np0005546954 nova_compute[187160]: 2025-12-05 13:12:04.469 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:04 np0005546954 nova_compute[187160]: 2025-12-05 13:12:04.640 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:05 np0005546954 podman[197513]: time="2025-12-05T13:12:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:12:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:12:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:12:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:12:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Dec  5 08:12:06 np0005546954 nova_compute[187160]: 2025-12-05 13:12:06.525 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:09 np0005546954 nova_compute[187160]: 2025-12-05 13:12:09.642 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:11 np0005546954 nova_compute[187160]: 2025-12-05 13:12:11.527 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:13 np0005546954 podman[219607]: 2025-12-05 13:12:13.579375658 +0000 UTC m=+0.075310490 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:12:13 np0005546954 podman[219606]: 2025-12-05 13:12:13.57974512 +0000 UTC m=+0.091797693 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Dec  5 08:12:14 np0005546954 nova_compute[187160]: 2025-12-05 13:12:14.644 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:16 np0005546954 nova_compute[187160]: 2025-12-05 13:12:16.528 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:12:16.978 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:12:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:12:16.980 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:12:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:12:16.980 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:12:17 np0005546954 nova_compute[187160]: 2025-12-05 13:12:17.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:12:17 np0005546954 nova_compute[187160]: 2025-12-05 13:12:17.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:12:17 np0005546954 nova_compute[187160]: 2025-12-05 13:12:17.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:12:17 np0005546954 nova_compute[187160]: 2025-12-05 13:12:17.406 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:12:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:12:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:12:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:12:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:12:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:12:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:12:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:12:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:12:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:12:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:12:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:12:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:12:19 np0005546954 nova_compute[187160]: 2025-12-05 13:12:19.647 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:20 np0005546954 nova_compute[187160]: 2025-12-05 13:12:20.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:12:20 np0005546954 nova_compute[187160]: 2025-12-05 13:12:20.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:12:21 np0005546954 nova_compute[187160]: 2025-12-05 13:12:21.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:12:21 np0005546954 nova_compute[187160]: 2025-12-05 13:12:21.531 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:22 np0005546954 nova_compute[187160]: 2025-12-05 13:12:22.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:12:23 np0005546954 nova_compute[187160]: 2025-12-05 13:12:23.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:12:24 np0005546954 nova_compute[187160]: 2025-12-05 13:12:24.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:12:24 np0005546954 nova_compute[187160]: 2025-12-05 13:12:24.649 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:24 np0005546954 nova_compute[187160]: 2025-12-05 13:12:24.651 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:12:24 np0005546954 nova_compute[187160]: 2025-12-05 13:12:24.651 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:12:24 np0005546954 nova_compute[187160]: 2025-12-05 13:12:24.652 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:12:24 np0005546954 nova_compute[187160]: 2025-12-05 13:12:24.652 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:12:24 np0005546954 nova_compute[187160]: 2025-12-05 13:12:24.870 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:12:24 np0005546954 nova_compute[187160]: 2025-12-05 13:12:24.871 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5868MB free_disk=73.32936477661133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:12:24 np0005546954 nova_compute[187160]: 2025-12-05 13:12:24.871 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:12:24 np0005546954 nova_compute[187160]: 2025-12-05 13:12:24.871 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:12:25 np0005546954 nova_compute[187160]: 2025-12-05 13:12:25.258 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:12:25 np0005546954 nova_compute[187160]: 2025-12-05 13:12:25.258 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:12:25 np0005546954 nova_compute[187160]: 2025-12-05 13:12:25.283 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:12:25 np0005546954 nova_compute[187160]: 2025-12-05 13:12:25.306 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:12:25 np0005546954 nova_compute[187160]: 2025-12-05 13:12:25.308 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:12:25 np0005546954 nova_compute[187160]: 2025-12-05 13:12:25.309 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:12:26 np0005546954 nova_compute[187160]: 2025-12-05 13:12:26.531 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:26 np0005546954 podman[219648]: 2025-12-05 13:12:26.534056643 +0000 UTC m=+0.052182311 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:12:29 np0005546954 nova_compute[187160]: 2025-12-05 13:12:29.652 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:30 np0005546954 nova_compute[187160]: 2025-12-05 13:12:30.310 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:12:30 np0005546954 nova_compute[187160]: 2025-12-05 13:12:30.311 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:12:31 np0005546954 nova_compute[187160]: 2025-12-05 13:12:31.533 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:33 np0005546954 nova_compute[187160]: 2025-12-05 13:12:33.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:12:33 np0005546954 podman[219671]: 2025-12-05 13:12:33.588652749 +0000 UTC m=+0.084905309 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 08:12:33 np0005546954 podman[219670]: 2025-12-05 13:12:33.635444872 +0000 UTC m=+0.140382362 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Dec  5 08:12:34 np0005546954 nova_compute[187160]: 2025-12-05 13:12:34.655 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:35 np0005546954 podman[197513]: time="2025-12-05T13:12:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:12:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:12:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:12:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:12:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2594 "" "Go-http-client/1.1"
Dec  5 08:12:36 np0005546954 nova_compute[187160]: 2025-12-05 13:12:36.535 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:39 np0005546954 nova_compute[187160]: 2025-12-05 13:12:39.658 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:41 np0005546954 nova_compute[187160]: 2025-12-05 13:12:41.535 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:44 np0005546954 ovn_controller[95566]: 2025-12-05T13:12:44Z|00264|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec  5 08:12:44 np0005546954 podman[219721]: 2025-12-05 13:12:44.565259799 +0000 UTC m=+0.077269992 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6)
Dec  5 08:12:44 np0005546954 podman[219722]: 2025-12-05 13:12:44.591082771 +0000 UTC m=+0.091786233 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 08:12:44 np0005546954 nova_compute[187160]: 2025-12-05 13:12:44.661 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:46 np0005546954 nova_compute[187160]: 2025-12-05 13:12:46.537 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:12:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:12:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:12:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:12:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:12:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:12:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:12:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:12:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:12:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:12:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:12:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:12:49 np0005546954 nova_compute[187160]: 2025-12-05 13:12:49.662 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:51 np0005546954 nova_compute[187160]: 2025-12-05 13:12:51.539 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:54 np0005546954 nova_compute[187160]: 2025-12-05 13:12:54.665 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:56 np0005546954 nova_compute[187160]: 2025-12-05 13:12:56.540 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:12:57 np0005546954 podman[219763]: 2025-12-05 13:12:57.532147693 +0000 UTC m=+0.050538701 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec  5 08:12:59 np0005546954 nova_compute[187160]: 2025-12-05 13:12:59.668 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:01 np0005546954 nova_compute[187160]: 2025-12-05 13:13:01.543 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:04 np0005546954 podman[219784]: 2025-12-05 13:13:04.562896219 +0000 UTC m=+0.075043463 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:13:04 np0005546954 podman[219783]: 2025-12-05 13:13:04.58293057 +0000 UTC m=+0.090997067 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  5 08:13:04 np0005546954 nova_compute[187160]: 2025-12-05 13:13:04.670 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:05 np0005546954 podman[197513]: time="2025-12-05T13:13:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:13:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:13:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:13:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:13:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Dec  5 08:13:06 np0005546954 nova_compute[187160]: 2025-12-05 13:13:06.545 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:09 np0005546954 nova_compute[187160]: 2025-12-05 13:13:09.733 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:11 np0005546954 nova_compute[187160]: 2025-12-05 13:13:11.546 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:14 np0005546954 nova_compute[187160]: 2025-12-05 13:13:14.735 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:15 np0005546954 podman[219831]: 2025-12-05 13:13:15.543023038 +0000 UTC m=+0.052347738 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350)
Dec  5 08:13:15 np0005546954 podman[219832]: 2025-12-05 13:13:15.574079802 +0000 UTC m=+0.079269873 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 08:13:16 np0005546954 nova_compute[187160]: 2025-12-05 13:13:16.548 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:13:16.979 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:13:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:13:16.979 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:13:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:13:16.979 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:13:19 np0005546954 nova_compute[187160]: 2025-12-05 13:13:19.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:13:19 np0005546954 nova_compute[187160]: 2025-12-05 13:13:19.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:13:19 np0005546954 nova_compute[187160]: 2025-12-05 13:13:19.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:13:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:13:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:13:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:13:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:13:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:13:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:13:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:13:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:13:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:13:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:13:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:13:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:13:19 np0005546954 nova_compute[187160]: 2025-12-05 13:13:19.737 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:21 np0005546954 nova_compute[187160]: 2025-12-05 13:13:21.550 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:22 np0005546954 nova_compute[187160]: 2025-12-05 13:13:22.667 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:13:22 np0005546954 nova_compute[187160]: 2025-12-05 13:13:22.668 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:13:22 np0005546954 nova_compute[187160]: 2025-12-05 13:13:22.668 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:13:22 np0005546954 nova_compute[187160]: 2025-12-05 13:13:22.668 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:13:24 np0005546954 nova_compute[187160]: 2025-12-05 13:13:24.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:13:24 np0005546954 nova_compute[187160]: 2025-12-05 13:13:24.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:13:24 np0005546954 nova_compute[187160]: 2025-12-05 13:13:24.739 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.181 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.182 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.182 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.182 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.356 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.358 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5866MB free_disk=73.32936477661133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.358 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.358 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.552 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.590 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.591 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.612 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.639 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.642 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:13:26 np0005546954 nova_compute[187160]: 2025-12-05 13:13:26.642 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:13:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:13:28.102 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:13:28 np0005546954 nova_compute[187160]: 2025-12-05 13:13:28.104 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:28 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:13:28.105 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 08:13:28 np0005546954 podman[219874]: 2025-12-05 13:13:28.554499188 +0000 UTC m=+0.066540748 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 08:13:29 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:13:29.109 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:13:29 np0005546954 nova_compute[187160]: 2025-12-05 13:13:29.643 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:13:29 np0005546954 nova_compute[187160]: 2025-12-05 13:13:29.644 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:13:29 np0005546954 nova_compute[187160]: 2025-12-05 13:13:29.741 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:31 np0005546954 nova_compute[187160]: 2025-12-05 13:13:31.554 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:32 np0005546954 nova_compute[187160]: 2025-12-05 13:13:32.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:13:33 np0005546954 nova_compute[187160]: 2025-12-05 13:13:33.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:13:34 np0005546954 nova_compute[187160]: 2025-12-05 13:13:34.744 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:35 np0005546954 podman[219894]: 2025-12-05 13:13:35.54521648 +0000 UTC m=+0.058671054 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 08:13:35 np0005546954 podman[219893]: 2025-12-05 13:13:35.572162166 +0000 UTC m=+0.088714486 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller)
Dec  5 08:13:35 np0005546954 podman[197513]: time="2025-12-05T13:13:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:13:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:13:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:13:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:13:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2592 "" "Go-http-client/1.1"
Dec  5 08:13:36 np0005546954 nova_compute[187160]: 2025-12-05 13:13:36.555 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:39 np0005546954 nova_compute[187160]: 2025-12-05 13:13:39.747 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:41 np0005546954 nova_compute[187160]: 2025-12-05 13:13:41.559 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:44 np0005546954 nova_compute[187160]: 2025-12-05 13:13:44.750 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:46 np0005546954 nova_compute[187160]: 2025-12-05 13:13:46.560 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:46 np0005546954 podman[219947]: 2025-12-05 13:13:46.569094906 +0000 UTC m=+0.081624027 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Dec  5 08:13:46 np0005546954 podman[219948]: 2025-12-05 13:13:46.572742709 +0000 UTC m=+0.073382430 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3)
Dec  5 08:13:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:13:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:13:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:13:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:13:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:13:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:13:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:13:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:13:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:13:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:13:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:13:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:13:49 np0005546954 nova_compute[187160]: 2025-12-05 13:13:49.751 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:51 np0005546954 nova_compute[187160]: 2025-12-05 13:13:51.562 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:54 np0005546954 nova_compute[187160]: 2025-12-05 13:13:54.754 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:56 np0005546954 nova_compute[187160]: 2025-12-05 13:13:56.564 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:13:59 np0005546954 podman[219991]: 2025-12-05 13:13:59.58815451 +0000 UTC m=+0.088385087 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec  5 08:13:59 np0005546954 nova_compute[187160]: 2025-12-05 13:13:59.757 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:01 np0005546954 nova_compute[187160]: 2025-12-05 13:14:01.598 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:04 np0005546954 nova_compute[187160]: 2025-12-05 13:14:04.759 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:05 np0005546954 podman[197513]: time="2025-12-05T13:14:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:14:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:14:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:14:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:14:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Dec  5 08:14:06 np0005546954 podman[220008]: 2025-12-05 13:14:06.576919259 +0000 UTC m=+0.090712969 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec  5 08:14:06 np0005546954 podman[220009]: 2025-12-05 13:14:06.576989521 +0000 UTC m=+0.078960193 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 08:14:06 np0005546954 nova_compute[187160]: 2025-12-05 13:14:06.598 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:08 np0005546954 nova_compute[187160]: 2025-12-05 13:14:08.347 187164 DEBUG nova.virt.libvirt.driver [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Creating tmpfile /var/lib/nova/instances/tmpu9c5lib_ to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  5 08:14:08 np0005546954 nova_compute[187160]: 2025-12-05 13:14:08.348 187164 DEBUG nova.compute.manager [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu9c5lib_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  5 08:14:09 np0005546954 nova_compute[187160]: 2025-12-05 13:14:09.795 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:10 np0005546954 nova_compute[187160]: 2025-12-05 13:14:10.542 187164 DEBUG nova.compute.manager [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu9c5lib_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5e06965f-7c85-4c67-a69b-fd6a4d9c46ba',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  5 08:14:10 np0005546954 nova_compute[187160]: 2025-12-05 13:14:10.576 187164 DEBUG oslo_concurrency.lockutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-5e06965f-7c85-4c67-a69b-fd6a4d9c46ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:14:10 np0005546954 nova_compute[187160]: 2025-12-05 13:14:10.577 187164 DEBUG oslo_concurrency.lockutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-5e06965f-7c85-4c67-a69b-fd6a4d9c46ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:14:10 np0005546954 nova_compute[187160]: 2025-12-05 13:14:10.577 187164 DEBUG nova.network.neutron [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 08:14:11 np0005546954 nova_compute[187160]: 2025-12-05 13:14:11.601 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:12 np0005546954 nova_compute[187160]: 2025-12-05 13:14:12.589 187164 DEBUG nova.network.neutron [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Updating instance_info_cache with network_info: [{"id": "cc69b7d6-8606-4956-8875-fc8b22e057f9", "address": "fa:16:3e:cb:2e:ac", "network": {"id": "dbffd75b-eadb-49d2-adf6-6e9886260fa0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1524978476-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9c39db0d5634bb28b51abbe0e5a2c82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc69b7d6-86", "ovs_interfaceid": "cc69b7d6-8606-4956-8875-fc8b22e057f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.582 187164 DEBUG oslo_concurrency.lockutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-5e06965f-7c85-4c67-a69b-fd6a4d9c46ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.584 187164 DEBUG nova.virt.libvirt.driver [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu9c5lib_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5e06965f-7c85-4c67-a69b-fd6a4d9c46ba',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.585 187164 DEBUG nova.virt.libvirt.driver [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Creating instance directory: /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.585 187164 DEBUG nova.virt.libvirt.driver [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Creating disk.info with the contents: {'/var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk': 'qcow2', '/var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.585 187164 DEBUG nova.virt.libvirt.driver [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.586 187164 DEBUG nova.objects.instance [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.669 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.722 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.723 187164 DEBUG oslo_concurrency.lockutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.724 187164 DEBUG oslo_concurrency.lockutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.740 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.792 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.793 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.825 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4,backing_fmt=raw /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.826 187164 DEBUG oslo_concurrency.lockutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "c9b7a6045cd59d7d8edb33eed052a432c9b974a4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.826 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.901 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.902 187164 DEBUG nova.virt.disk.api [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Checking if we can resize image /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.903 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.954 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.955 187164 DEBUG nova.virt.disk.api [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Cannot resize image /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 08:14:13 np0005546954 nova_compute[187160]: 2025-12-05 13:14:13.956 187164 DEBUG nova.objects.instance [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lazy-loading 'migration_context' on Instance uuid 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.480 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.524 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk.config 485376" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.527 187164 DEBUG nova.virt.libvirt.volume.remotefs [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk.config to /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.527 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk.config /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.798 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.967 187164 DEBUG oslo_concurrency.processutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk.config /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.969 187164 DEBUG nova.virt.libvirt.driver [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.971 187164 DEBUG nova.virt.libvirt.vif [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T13:13:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-347415860',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-347415860',id=29,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:13:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d9c39db0d5634bb28b51abbe0e5a2c82',ramdisk_id='',reservation_id='r-lzku1gyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-527033115',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-527033115-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T13:13:30Z,user_data=None,user_id='db8a6eb3223545318fef56cd23348ce4',uuid=5e06965f-7c85-4c67-a69b-fd6a4d9c46ba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc69b7d6-8606-4956-8875-fc8b22e057f9", "address": "fa:16:3e:cb:2e:ac", "network": {"id": "dbffd75b-eadb-49d2-adf6-6e9886260fa0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1524978476-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9c39db0d5634bb28b51abbe0e5a2c82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcc69b7d6-86", "ovs_interfaceid": "cc69b7d6-8606-4956-8875-fc8b22e057f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.972 187164 DEBUG nova.network.os_vif_util [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converting VIF {"id": "cc69b7d6-8606-4956-8875-fc8b22e057f9", "address": "fa:16:3e:cb:2e:ac", "network": {"id": "dbffd75b-eadb-49d2-adf6-6e9886260fa0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1524978476-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9c39db0d5634bb28b51abbe0e5a2c82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcc69b7d6-86", "ovs_interfaceid": "cc69b7d6-8606-4956-8875-fc8b22e057f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.974 187164 DEBUG nova.network.os_vif_util [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:2e:ac,bridge_name='br-int',has_traffic_filtering=True,id=cc69b7d6-8606-4956-8875-fc8b22e057f9,network=Network(dbffd75b-eadb-49d2-adf6-6e9886260fa0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc69b7d6-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.974 187164 DEBUG os_vif [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:2e:ac,bridge_name='br-int',has_traffic_filtering=True,id=cc69b7d6-8606-4956-8875-fc8b22e057f9,network=Network(dbffd75b-eadb-49d2-adf6-6e9886260fa0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc69b7d6-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.975 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.976 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.977 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.981 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.982 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc69b7d6-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.982 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc69b7d6-86, col_values=(('external_ids', {'iface-id': 'cc69b7d6-8606-4956-8875-fc8b22e057f9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:2e:ac', 'vm-uuid': '5e06965f-7c85-4c67-a69b-fd6a4d9c46ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:14:14 np0005546954 NetworkManager[55665]: <info>  [1764940454.9862] manager: (tapcc69b7d6-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.991 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.992 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.994 187164 INFO os_vif [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:2e:ac,bridge_name='br-int',has_traffic_filtering=True,id=cc69b7d6-8606-4956-8875-fc8b22e057f9,network=Network(dbffd75b-eadb-49d2-adf6-6e9886260fa0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc69b7d6-86')#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.995 187164 DEBUG nova.virt.libvirt.driver [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  5 08:14:14 np0005546954 nova_compute[187160]: 2025-12-05 13:14:14.996 187164 DEBUG nova.compute.manager [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu9c5lib_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5e06965f-7c85-4c67-a69b-fd6a4d9c46ba',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  5 08:14:16 np0005546954 nova_compute[187160]: 2025-12-05 13:14:16.603 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:16.980 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:14:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:16.980 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:14:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:16.980 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:14:17 np0005546954 podman[220079]: 2025-12-05 13:14:17.565456739 +0000 UTC m=+0.076731904 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 08:14:17 np0005546954 podman[220078]: 2025-12-05 13:14:17.572630532 +0000 UTC m=+0.083053871 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, distribution-scope=public)
Dec  5 08:14:19 np0005546954 nova_compute[187160]: 2025-12-05 13:14:19.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:14:19 np0005546954 nova_compute[187160]: 2025-12-05 13:14:19.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:14:19 np0005546954 nova_compute[187160]: 2025-12-05 13:14:19.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:14:19 np0005546954 nova_compute[187160]: 2025-12-05 13:14:19.176 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:14:19 np0005546954 nova_compute[187160]: 2025-12-05 13:14:19.320 187164 DEBUG nova.network.neutron [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Port cc69b7d6-8606-4956-8875-fc8b22e057f9 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  5 08:14:19 np0005546954 nova_compute[187160]: 2025-12-05 13:14:19.323 187164 DEBUG nova.compute.manager [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu9c5lib_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5e06965f-7c85-4c67-a69b-fd6a4d9c46ba',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  5 08:14:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:14:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:14:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:14:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:14:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:14:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:14:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:14:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:14:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:14:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:14:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:14:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:14:19 np0005546954 nova_compute[187160]: 2025-12-05 13:14:19.986 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:20 np0005546954 systemd[1]: Starting libvirt proxy daemon...
Dec  5 08:14:20 np0005546954 systemd[1]: Started libvirt proxy daemon.
Dec  5 08:14:20 np0005546954 kernel: tapcc69b7d6-86: entered promiscuous mode
Dec  5 08:14:20 np0005546954 NetworkManager[55665]: <info>  [1764940460.7982] manager: (tapcc69b7d6-86): new Tun device (/org/freedesktop/NetworkManager/Devices/103)
Dec  5 08:14:20 np0005546954 systemd-udevd[220151]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:14:20 np0005546954 ovn_controller[95566]: 2025-12-05T13:14:20Z|00265|binding|INFO|Claiming lport cc69b7d6-8606-4956-8875-fc8b22e057f9 for this additional chassis.
Dec  5 08:14:20 np0005546954 ovn_controller[95566]: 2025-12-05T13:14:20Z|00266|binding|INFO|cc69b7d6-8606-4956-8875-fc8b22e057f9: Claiming fa:16:3e:cb:2e:ac 10.100.0.8
Dec  5 08:14:20 np0005546954 nova_compute[187160]: 2025-12-05 13:14:20.854 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:20 np0005546954 nova_compute[187160]: 2025-12-05 13:14:20.879 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:20 np0005546954 NetworkManager[55665]: <info>  [1764940460.8817] device (tapcc69b7d6-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 08:14:20 np0005546954 NetworkManager[55665]: <info>  [1764940460.8827] device (tapcc69b7d6-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 08:14:20 np0005546954 systemd-machined[153497]: New machine qemu-26-instance-0000001d.
Dec  5 08:14:20 np0005546954 ovn_controller[95566]: 2025-12-05T13:14:20Z|00267|binding|INFO|Setting lport cc69b7d6-8606-4956-8875-fc8b22e057f9 ovn-installed in OVS
Dec  5 08:14:20 np0005546954 nova_compute[187160]: 2025-12-05 13:14:20.935 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:20 np0005546954 systemd[1]: Started Virtual Machine qemu-26-instance-0000001d.
Dec  5 08:14:21 np0005546954 nova_compute[187160]: 2025-12-05 13:14:21.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:14:21 np0005546954 nova_compute[187160]: 2025-12-05 13:14:21.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:14:21 np0005546954 nova_compute[187160]: 2025-12-05 13:14:21.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:14:21 np0005546954 nova_compute[187160]: 2025-12-05 13:14:21.375 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764940461.3740587, 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:14:21 np0005546954 nova_compute[187160]: 2025-12-05 13:14:21.375 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] VM Started (Lifecycle Event)#033[00m
Dec  5 08:14:21 np0005546954 nova_compute[187160]: 2025-12-05 13:14:21.606 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:21 np0005546954 nova_compute[187160]: 2025-12-05 13:14:21.677 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:14:22 np0005546954 nova_compute[187160]: 2025-12-05 13:14:22.219 187164 DEBUG nova.virt.driver [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] Emitting event <LifecycleEvent: 1764940462.2195718, 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:14:22 np0005546954 nova_compute[187160]: 2025-12-05 13:14:22.220 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] VM Resumed (Lifecycle Event)#033[00m
Dec  5 08:14:22 np0005546954 nova_compute[187160]: 2025-12-05 13:14:22.592 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:14:22 np0005546954 nova_compute[187160]: 2025-12-05 13:14:22.596 187164 DEBUG nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 08:14:22 np0005546954 nova_compute[187160]: 2025-12-05 13:14:22.664 187164 INFO nova.compute.manager [None req-08321cc2-68c5-4c3a-9e69-899b272c4319 - - - - - -] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  5 08:14:23 np0005546954 ovn_controller[95566]: 2025-12-05T13:14:23Z|00268|binding|INFO|Claiming lport cc69b7d6-8606-4956-8875-fc8b22e057f9 for this chassis.
Dec  5 08:14:23 np0005546954 ovn_controller[95566]: 2025-12-05T13:14:23Z|00269|binding|INFO|cc69b7d6-8606-4956-8875-fc8b22e057f9: Claiming fa:16:3e:cb:2e:ac 10.100.0.8
Dec  5 08:14:23 np0005546954 ovn_controller[95566]: 2025-12-05T13:14:23Z|00270|binding|INFO|Setting lport cc69b7d6-8606-4956-8875-fc8b22e057f9 up in Southbound
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.670 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:2e:ac 10.100.0.8'], port_security=['fa:16:3e:cb:2e:ac 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5e06965f-7c85-4c67-a69b-fd6a4d9c46ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dbffd75b-eadb-49d2-adf6-6e9886260fa0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9c39db0d5634bb28b51abbe0e5a2c82', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'bc9e66f3-cdee-4f48-b699-4de7d6b9137f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9963a728-1bae-45e9-828f-319602e2a857, chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=cc69b7d6-8606-4956-8875-fc8b22e057f9) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.672 104428 INFO neutron.agent.ovn.metadata.agent [-] Port cc69b7d6-8606-4956-8875-fc8b22e057f9 in datapath dbffd75b-eadb-49d2-adf6-6e9886260fa0 bound to our chassis#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.673 104428 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dbffd75b-eadb-49d2-adf6-6e9886260fa0#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.684 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6df678-ebd4-4ab5-8213-218a6b628acf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.685 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdbffd75b-e1 in ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.688 208690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdbffd75b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.688 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[e40a4770-3099-4745-85d0-8a5d09282870]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.689 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[497ed796-72d0-4ada-92a7-103f6e0e8d40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.701 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[8a3abdce-aab4-490c-8dbe-3ce43b098373]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.714 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a497e0-052f-4353-b8e0-51e3a1ec505d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.743 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[94a08b7f-a350-48b9-b6ea-8527ad5f0585]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.750 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ff535463-934c-4299-8087-dcb6f9406bba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 NetworkManager[55665]: <info>  [1764940463.7530] manager: (tapdbffd75b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/104)
Dec  5 08:14:23 np0005546954 systemd-udevd[220153]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.789 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[5d80da09-dc6d-4d7e-9c58-6e7169a6cb02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.793 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[c238d27b-f6e9-4667-bfb5-10c43ea878be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 NetworkManager[55665]: <info>  [1764940463.8209] device (tapdbffd75b-e0): carrier: link connected
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.830 208711 DEBUG oslo.privsep.daemon [-] privsep: reply[e247bfc9-d96f-4740-b970-61c401a76798]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.861 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[57913a27-967a-4dfc-9c63-3cc9e1a44413]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdbffd75b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:86:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547044, 'reachable_time': 29028, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220206, 'error': None, 'target': 'ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.881 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[6f50986a-5125-46d7-bbe2-99ced1633de6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef4:86d2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547044, 'tstamp': 547044}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220207, 'error': None, 'target': 'ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.896 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[0276d38b-3974-4435-b9a1-868a8d011100]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdbffd75b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:86:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547044, 'reachable_time': 29028, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220208, 'error': None, 'target': 'ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.936 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[52b106ea-d2a0-46c0-9280-64f9250e57ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.987 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[ffc76b93-afa1-4673-8d26-75e9b60ce10d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.990 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdbffd75b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.990 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.991 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdbffd75b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:14:23 np0005546954 NetworkManager[55665]: <info>  [1764940463.9929] manager: (tapdbffd75b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Dec  5 08:14:23 np0005546954 nova_compute[187160]: 2025-12-05 13:14:23.992 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:23 np0005546954 kernel: tapdbffd75b-e0: entered promiscuous mode
Dec  5 08:14:23 np0005546954 nova_compute[187160]: 2025-12-05 13:14:23.994 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.996 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdbffd75b-e0, col_values=(('external_ids', {'iface-id': 'b79b8c80-4e9a-41b1-9201-359e1fe4f5a7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:14:23 np0005546954 nova_compute[187160]: 2025-12-05 13:14:23.997 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:23 np0005546954 ovn_controller[95566]: 2025-12-05T13:14:23Z|00271|binding|INFO|Releasing lport b79b8c80-4e9a-41b1-9201-359e1fe4f5a7 from this chassis (sb_readonly=0)
Dec  5 08:14:23 np0005546954 nova_compute[187160]: 2025-12-05 13:14:23.998 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:23 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.999 104428 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dbffd75b-eadb-49d2-adf6-6e9886260fa0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dbffd75b-eadb-49d2-adf6-6e9886260fa0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:23.999 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[71db7cde-0f1b-4879-b83a-765ffb8c1bce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:24.000 104428 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]: global
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    log         /dev/log local0 debug
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    log-tag     haproxy-metadata-proxy-dbffd75b-eadb-49d2-adf6-6e9886260fa0
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    user        root
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    group       root
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    maxconn     1024
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    pidfile     /var/lib/neutron/external/pids/dbffd75b-eadb-49d2-adf6-6e9886260fa0.pid.haproxy
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    daemon
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]: defaults
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    log global
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    mode http
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    option httplog
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    option dontlognull
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    option http-server-close
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    option forwardfor
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    retries                 3
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    timeout http-request    30s
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    timeout connect         30s
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    timeout client          32s
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    timeout server          32s
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    timeout http-keep-alive 30s
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]: 
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]: listen listener
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    bind 169.254.169.254:80
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]:    http-request add-header X-OVN-Network-ID dbffd75b-eadb-49d2-adf6-6e9886260fa0
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 08:14:24 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:24.001 104428 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0', 'env', 'PROCESS_TAG=haproxy-dbffd75b-eadb-49d2-adf6-6e9886260fa0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dbffd75b-eadb-49d2-adf6-6e9886260fa0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 08:14:24 np0005546954 nova_compute[187160]: 2025-12-05 13:14:24.008 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:24 np0005546954 podman[220241]: 2025-12-05 13:14:24.351199991 +0000 UTC m=+0.051936604 container create 6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:14:24 np0005546954 systemd[1]: Started libpod-conmon-6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1.scope.
Dec  5 08:14:24 np0005546954 systemd[1]: Started libcrun container.
Dec  5 08:14:24 np0005546954 podman[220241]: 2025-12-05 13:14:24.324377058 +0000 UTC m=+0.025113701 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 08:14:24 np0005546954 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4b097116bc833cda6c47521f9efa14dc362ab4090f26ccfc1ac57a84650aa4b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 08:14:24 np0005546954 podman[220241]: 2025-12-05 13:14:24.430946889 +0000 UTC m=+0.131683512 container init 6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:14:24 np0005546954 podman[220241]: 2025-12-05 13:14:24.436877703 +0000 UTC m=+0.137614306 container start 6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 08:14:24 np0005546954 neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0[220257]: [NOTICE]   (220261) : New worker (220263) forked
Dec  5 08:14:24 np0005546954 neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0[220257]: [NOTICE]   (220261) : Loading success.
Dec  5 08:14:24 np0005546954 nova_compute[187160]: 2025-12-05 13:14:24.583 187164 INFO nova.compute.manager [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Post operation of migration started#033[00m
Dec  5 08:14:24 np0005546954 nova_compute[187160]: 2025-12-05 13:14:24.870 187164 DEBUG oslo_concurrency.lockutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "refresh_cache-5e06965f-7c85-4c67-a69b-fd6a4d9c46ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 08:14:24 np0005546954 nova_compute[187160]: 2025-12-05 13:14:24.871 187164 DEBUG oslo_concurrency.lockutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquired lock "refresh_cache-5e06965f-7c85-4c67-a69b-fd6a4d9c46ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 08:14:24 np0005546954 nova_compute[187160]: 2025-12-05 13:14:24.872 187164 DEBUG nova.network.neutron [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 08:14:24 np0005546954 nova_compute[187160]: 2025-12-05 13:14:24.989 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:25 np0005546954 nova_compute[187160]: 2025-12-05 13:14:25.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:14:25 np0005546954 nova_compute[187160]: 2025-12-05 13:14:25.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.157 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.157 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.157 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.158 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.227 187164 DEBUG nova.network.neutron [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Updating instance_info_cache with network_info: [{"id": "cc69b7d6-8606-4956-8875-fc8b22e057f9", "address": "fa:16:3e:cb:2e:ac", "network": {"id": "dbffd75b-eadb-49d2-adf6-6e9886260fa0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1524978476-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9c39db0d5634bb28b51abbe0e5a2c82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc69b7d6-86", "ovs_interfaceid": "cc69b7d6-8606-4956-8875-fc8b22e057f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.502 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.581 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.582 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.607 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.620 187164 DEBUG oslo_concurrency.lockutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Releasing lock "refresh_cache-5e06965f-7c85-4c67-a69b-fd6a4d9c46ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.636 187164 DEBUG oslo_concurrency.processutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.646 187164 DEBUG oslo_concurrency.lockutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.646 187164 DEBUG oslo_concurrency.lockutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.647 187164 DEBUG oslo_concurrency.lockutils [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.650 187164 INFO nova.virt.libvirt.driver [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  5 08:14:26 np0005546954 virtqemud[186730]: Domain id=26 name='instance-0000001d' uuid=5e06965f-7c85-4c67-a69b-fd6a4d9c46ba is tainted: custom-monitor
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.804 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.807 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5666MB free_disk=73.30020904541016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.807 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.809 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.871 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Migration for instance 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.909 187164 INFO nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Updating resource usage from migration 51b6875b-b40c-4a85-b41e-c22bd5a31dc5#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.909 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Starting to track incoming migration 51b6875b-b40c-4a85-b41e-c22bd5a31dc5 with flavor b4ea63be-97f8-4a48-b000-66321c4ddb27 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.972 187164 WARNING nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Instance 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.973 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.973 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:14:26 np0005546954 nova_compute[187160]: 2025-12-05 13:14:26.987 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing inventories for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 08:14:27 np0005546954 nova_compute[187160]: 2025-12-05 13:14:27.009 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating ProviderTree inventory for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 08:14:27 np0005546954 nova_compute[187160]: 2025-12-05 13:14:27.010 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 08:14:27 np0005546954 nova_compute[187160]: 2025-12-05 13:14:27.025 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing aggregate associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 08:14:27 np0005546954 nova_compute[187160]: 2025-12-05 13:14:27.048 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing trait associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 08:14:27 np0005546954 nova_compute[187160]: 2025-12-05 13:14:27.088 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:14:27 np0005546954 nova_compute[187160]: 2025-12-05 13:14:27.106 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:14:27 np0005546954 nova_compute[187160]: 2025-12-05 13:14:27.144 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:14:27 np0005546954 nova_compute[187160]: 2025-12-05 13:14:27.144 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:14:27 np0005546954 nova_compute[187160]: 2025-12-05 13:14:27.657 187164 INFO nova.virt.libvirt.driver [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  5 08:14:28 np0005546954 nova_compute[187160]: 2025-12-05 13:14:28.664 187164 INFO nova.virt.libvirt.driver [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  5 08:14:28 np0005546954 nova_compute[187160]: 2025-12-05 13:14:28.670 187164 DEBUG nova.compute.manager [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:14:29 np0005546954 nova_compute[187160]: 2025-12-05 13:14:29.285 187164 DEBUG nova.objects.instance [None req-e6dc1518-5430-4be0-bf0f-4ebd6389fbb8 49fa64bf1ca64b12ae07f314eedd348a 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 08:14:29 np0005546954 nova_compute[187160]: 2025-12-05 13:14:29.992 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:30 np0005546954 nova_compute[187160]: 2025-12-05 13:14:30.151 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:14:30 np0005546954 nova_compute[187160]: 2025-12-05 13:14:30.152 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:14:30 np0005546954 podman[220279]: 2025-12-05 13:14:30.547037761 +0000 UTC m=+0.058237510 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 08:14:31 np0005546954 nova_compute[187160]: 2025-12-05 13:14:31.642 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:34 np0005546954 nova_compute[187160]: 2025-12-05 13:14:34.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:14:34 np0005546954 nova_compute[187160]: 2025-12-05 13:14:34.994 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:35 np0005546954 podman[197513]: time="2025-12-05T13:14:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:14:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:14:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  5 08:14:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:14:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3060 "" "Go-http-client/1.1"
Dec  5 08:14:36 np0005546954 nova_compute[187160]: 2025-12-05 13:14:36.643 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:37 np0005546954 podman[220304]: 2025-12-05 13:14:37.528124343 +0000 UTC m=+0.043854494 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 08:14:37 np0005546954 podman[220303]: 2025-12-05 13:14:37.555916836 +0000 UTC m=+0.071960567 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 08:14:37 np0005546954 nova_compute[187160]: 2025-12-05 13:14:37.865 187164 DEBUG oslo_concurrency.lockutils [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Acquiring lock "5e06965f-7c85-4c67-a69b-fd6a4d9c46ba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:14:37 np0005546954 nova_compute[187160]: 2025-12-05 13:14:37.866 187164 DEBUG oslo_concurrency.lockutils [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Lock "5e06965f-7c85-4c67-a69b-fd6a4d9c46ba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:14:37 np0005546954 nova_compute[187160]: 2025-12-05 13:14:37.866 187164 DEBUG oslo_concurrency.lockutils [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Acquiring lock "5e06965f-7c85-4c67-a69b-fd6a4d9c46ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:14:37 np0005546954 nova_compute[187160]: 2025-12-05 13:14:37.866 187164 DEBUG oslo_concurrency.lockutils [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Lock "5e06965f-7c85-4c67-a69b-fd6a4d9c46ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:14:37 np0005546954 nova_compute[187160]: 2025-12-05 13:14:37.866 187164 DEBUG oslo_concurrency.lockutils [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Lock "5e06965f-7c85-4c67-a69b-fd6a4d9c46ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:14:37 np0005546954 nova_compute[187160]: 2025-12-05 13:14:37.867 187164 INFO nova.compute.manager [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Terminating instance#033[00m
Dec  5 08:14:37 np0005546954 nova_compute[187160]: 2025-12-05 13:14:37.868 187164 DEBUG nova.compute.manager [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 08:14:37 np0005546954 kernel: tapcc69b7d6-86 (unregistering): left promiscuous mode
Dec  5 08:14:37 np0005546954 NetworkManager[55665]: <info>  [1764940477.8970] device (tapcc69b7d6-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 08:14:37 np0005546954 ovn_controller[95566]: 2025-12-05T13:14:37Z|00272|binding|INFO|Releasing lport cc69b7d6-8606-4956-8875-fc8b22e057f9 from this chassis (sb_readonly=0)
Dec  5 08:14:37 np0005546954 ovn_controller[95566]: 2025-12-05T13:14:37Z|00273|binding|INFO|Setting lport cc69b7d6-8606-4956-8875-fc8b22e057f9 down in Southbound
Dec  5 08:14:37 np0005546954 nova_compute[187160]: 2025-12-05 13:14:37.912 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:37 np0005546954 ovn_controller[95566]: 2025-12-05T13:14:37Z|00274|binding|INFO|Removing iface tapcc69b7d6-86 ovn-installed in OVS
Dec  5 08:14:37 np0005546954 nova_compute[187160]: 2025-12-05 13:14:37.915 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:37 np0005546954 nova_compute[187160]: 2025-12-05 13:14:37.952 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:37 np0005546954 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Dec  5 08:14:37 np0005546954 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000001d.scope: Consumed 1.604s CPU time.
Dec  5 08:14:37 np0005546954 systemd-machined[153497]: Machine qemu-26-instance-0000001d terminated.
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.140 187164 INFO nova.virt.libvirt.driver [-] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Instance destroyed successfully.#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.141 187164 DEBUG nova.objects.instance [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Lazy-loading 'resources' on Instance uuid 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 08:14:38 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:38.202 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:2e:ac 10.100.0.8'], port_security=['fa:16:3e:cb:2e:ac 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5e06965f-7c85-4c67-a69b-fd6a4d9c46ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dbffd75b-eadb-49d2-adf6-6e9886260fa0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9c39db0d5634bb28b51abbe0e5a2c82', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'bc9e66f3-cdee-4f48-b699-4de7d6b9137f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9963a728-1bae-45e9-828f-319602e2a857, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>], logical_port=cc69b7d6-8606-4956-8875-fc8b22e057f9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd854341b20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:14:38 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:38.205 104428 INFO neutron.agent.ovn.metadata.agent [-] Port cc69b7d6-8606-4956-8875-fc8b22e057f9 in datapath dbffd75b-eadb-49d2-adf6-6e9886260fa0 unbound from our chassis#033[00m
Dec  5 08:14:38 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:38.207 104428 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dbffd75b-eadb-49d2-adf6-6e9886260fa0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 08:14:38 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:38.209 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[33e767ec-92aa-4adc-87df-63c858f7f737]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:38 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:38.210 104428 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0 namespace which is not needed anymore#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.214 187164 DEBUG nova.virt.libvirt.vif [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T13:13:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-347415860',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-347415860',id=29,image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T13:13:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d9c39db0d5634bb28b51abbe0e5a2c82',ramdisk_id='',reservation_id='r-lzku1gyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='f4c3125a-6fd0-40bb-aa00-a7e736ee853d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-527033115',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-527033115-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T13:14:29Z,user_data=None,user_id='db8a6eb3223545318fef56cd23348ce4',uuid=5e06965f-7c85-4c67-a69b-fd6a4d9c46ba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc69b7d6-8606-4956-8875-fc8b22e057f9", "address": "fa:16:3e:cb:2e:ac", "network": {"id": "dbffd75b-eadb-49d2-adf6-6e9886260fa0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1524978476-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9c39db0d5634bb28b51abbe0e5a2c82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc69b7d6-86", "ovs_interfaceid": "cc69b7d6-8606-4956-8875-fc8b22e057f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.215 187164 DEBUG nova.network.os_vif_util [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Converting VIF {"id": "cc69b7d6-8606-4956-8875-fc8b22e057f9", "address": "fa:16:3e:cb:2e:ac", "network": {"id": "dbffd75b-eadb-49d2-adf6-6e9886260fa0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1524978476-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9c39db0d5634bb28b51abbe0e5a2c82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc69b7d6-86", "ovs_interfaceid": "cc69b7d6-8606-4956-8875-fc8b22e057f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.216 187164 DEBUG nova.network.os_vif_util [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cb:2e:ac,bridge_name='br-int',has_traffic_filtering=True,id=cc69b7d6-8606-4956-8875-fc8b22e057f9,network=Network(dbffd75b-eadb-49d2-adf6-6e9886260fa0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc69b7d6-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.216 187164 DEBUG os_vif [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:2e:ac,bridge_name='br-int',has_traffic_filtering=True,id=cc69b7d6-8606-4956-8875-fc8b22e057f9,network=Network(dbffd75b-eadb-49d2-adf6-6e9886260fa0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc69b7d6-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.220 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.220 187164 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc69b7d6-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.222 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.224 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.228 187164 INFO os_vif [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:2e:ac,bridge_name='br-int',has_traffic_filtering=True,id=cc69b7d6-8606-4956-8875-fc8b22e057f9,network=Network(dbffd75b-eadb-49d2-adf6-6e9886260fa0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc69b7d6-86')#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.229 187164 INFO nova.virt.libvirt.driver [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Deleting instance files /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba_del#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.230 187164 INFO nova.virt.libvirt.driver [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Deletion of /var/lib/nova/instances/5e06965f-7c85-4c67-a69b-fd6a4d9c46ba_del complete#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.338 187164 INFO nova.compute.manager [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.338 187164 DEBUG oslo.service.loopingcall [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.339 187164 DEBUG nova.compute.manager [-] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 08:14:38 np0005546954 nova_compute[187160]: 2025-12-05 13:14:38.339 187164 DEBUG nova.network.neutron [-] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 08:14:38 np0005546954 neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0[220257]: [NOTICE]   (220261) : haproxy version is 2.8.14-c23fe91
Dec  5 08:14:38 np0005546954 neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0[220257]: [NOTICE]   (220261) : path to executable is /usr/sbin/haproxy
Dec  5 08:14:38 np0005546954 neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0[220257]: [ALERT]    (220261) : Current worker (220263) exited with code 143 (Terminated)
Dec  5 08:14:38 np0005546954 neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0[220257]: [WARNING]  (220261) : All workers exited. Exiting... (0)
Dec  5 08:14:38 np0005546954 systemd[1]: libpod-6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1.scope: Deactivated successfully.
Dec  5 08:14:38 np0005546954 conmon[220257]: conmon 6f9f8e43e4d0f1b40642 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1.scope/container/memory.events
Dec  5 08:14:38 np0005546954 podman[220397]: 2025-12-05 13:14:38.502042017 +0000 UTC m=+0.203084740 container died 6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:14:39 np0005546954 systemd[1]: var-lib-containers-storage-overlay-e4b097116bc833cda6c47521f9efa14dc362ab4090f26ccfc1ac57a84650aa4b-merged.mount: Deactivated successfully.
Dec  5 08:14:39 np0005546954 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1-userdata-shm.mount: Deactivated successfully.
Dec  5 08:14:39 np0005546954 podman[220397]: 2025-12-05 13:14:39.111307633 +0000 UTC m=+0.812350346 container cleanup 6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 08:14:39 np0005546954 systemd[1]: libpod-conmon-6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1.scope: Deactivated successfully.
Dec  5 08:14:39 np0005546954 podman[220426]: 2025-12-05 13:14:39.204914201 +0000 UTC m=+0.058022854 container remove 6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 08:14:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:39.212 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a96468-3adb-4f4e-b021-cc28fa25fb94]: (4, ('Fri Dec  5 01:14:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0 (6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1)\n6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1\nFri Dec  5 01:14:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0 (6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1)\n6f9f8e43e4d0f1b4064208d3466d7584a93379c5cfabc33af37f073c554b06d1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:39.213 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e73442-3c70-496f-9340-0c9034457997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:39.214 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdbffd75b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:14:39 np0005546954 nova_compute[187160]: 2025-12-05 13:14:39.251 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:39 np0005546954 kernel: tapdbffd75b-e0: left promiscuous mode
Dec  5 08:14:39 np0005546954 nova_compute[187160]: 2025-12-05 13:14:39.264 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:39.267 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[5a666619-f87a-4e7d-847b-895baf0b5865]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:39.288 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[229d3ab2-8c43-49e4-aa3b-4c5f4a3b74af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:39.289 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[a2439723-d3da-4ff3-8304-b484767e5104]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:39.314 208690 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8ef69c-f0c2-4266-a7ba-0aebc0c110d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547036, 'reachable_time': 23425, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220442, 'error': None, 'target': 'ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:39.316 104542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dbffd75b-eadb-49d2-adf6-6e9886260fa0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 08:14:39 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:39.316 104542 DEBUG oslo.privsep.daemon [-] privsep: reply[0298cb3e-3978-4eb7-b739-253bce739b95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 08:14:39 np0005546954 systemd[1]: run-netns-ovnmeta\x2ddbffd75b\x2deadb\x2d49d2\x2dadf6\x2d6e9886260fa0.mount: Deactivated successfully.
Dec  5 08:14:39 np0005546954 nova_compute[187160]: 2025-12-05 13:14:39.386 187164 DEBUG nova.compute.manager [req-6c8796bd-5b2c-434d-a64e-6b3a3fae7228 req-396bf0e2-0ff7-4ed8-a736-da847ef63cf6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Received event network-vif-unplugged-cc69b7d6-8606-4956-8875-fc8b22e057f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:14:39 np0005546954 nova_compute[187160]: 2025-12-05 13:14:39.386 187164 DEBUG oslo_concurrency.lockutils [req-6c8796bd-5b2c-434d-a64e-6b3a3fae7228 req-396bf0e2-0ff7-4ed8-a736-da847ef63cf6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "5e06965f-7c85-4c67-a69b-fd6a4d9c46ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:14:39 np0005546954 nova_compute[187160]: 2025-12-05 13:14:39.386 187164 DEBUG oslo_concurrency.lockutils [req-6c8796bd-5b2c-434d-a64e-6b3a3fae7228 req-396bf0e2-0ff7-4ed8-a736-da847ef63cf6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "5e06965f-7c85-4c67-a69b-fd6a4d9c46ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:14:39 np0005546954 nova_compute[187160]: 2025-12-05 13:14:39.387 187164 DEBUG oslo_concurrency.lockutils [req-6c8796bd-5b2c-434d-a64e-6b3a3fae7228 req-396bf0e2-0ff7-4ed8-a736-da847ef63cf6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "5e06965f-7c85-4c67-a69b-fd6a4d9c46ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:14:39 np0005546954 nova_compute[187160]: 2025-12-05 13:14:39.387 187164 DEBUG nova.compute.manager [req-6c8796bd-5b2c-434d-a64e-6b3a3fae7228 req-396bf0e2-0ff7-4ed8-a736-da847ef63cf6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] No waiting events found dispatching network-vif-unplugged-cc69b7d6-8606-4956-8875-fc8b22e057f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:14:39 np0005546954 nova_compute[187160]: 2025-12-05 13:14:39.387 187164 DEBUG nova.compute.manager [req-6c8796bd-5b2c-434d-a64e-6b3a3fae7228 req-396bf0e2-0ff7-4ed8-a736-da847ef63cf6 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Received event network-vif-unplugged-cc69b7d6-8606-4956-8875-fc8b22e057f9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.253 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:41 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:41.253 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 08:14:41 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:41.255 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 08:14:41 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:14:41.256 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.290 187164 DEBUG nova.network.neutron [-] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.380 187164 INFO nova.compute.manager [-] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Took 3.04 seconds to deallocate network for instance.#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.472 187164 DEBUG oslo_concurrency.lockutils [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.472 187164 DEBUG oslo_concurrency.lockutils [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.478 187164 DEBUG nova.compute.manager [req-06db2f22-af65-4574-9d2b-8046b9d470e3 req-ab059e07-db5b-48f9-8064-c26ba77c99d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Received event network-vif-plugged-cc69b7d6-8606-4956-8875-fc8b22e057f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.479 187164 DEBUG oslo_concurrency.lockutils [req-06db2f22-af65-4574-9d2b-8046b9d470e3 req-ab059e07-db5b-48f9-8064-c26ba77c99d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Acquiring lock "5e06965f-7c85-4c67-a69b-fd6a4d9c46ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.479 187164 DEBUG oslo_concurrency.lockutils [req-06db2f22-af65-4574-9d2b-8046b9d470e3 req-ab059e07-db5b-48f9-8064-c26ba77c99d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "5e06965f-7c85-4c67-a69b-fd6a4d9c46ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.479 187164 DEBUG oslo_concurrency.lockutils [req-06db2f22-af65-4574-9d2b-8046b9d470e3 req-ab059e07-db5b-48f9-8064-c26ba77c99d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] Lock "5e06965f-7c85-4c67-a69b-fd6a4d9c46ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.479 187164 DEBUG nova.compute.manager [req-06db2f22-af65-4574-9d2b-8046b9d470e3 req-ab059e07-db5b-48f9-8064-c26ba77c99d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] No waiting events found dispatching network-vif-plugged-cc69b7d6-8606-4956-8875-fc8b22e057f9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.480 187164 WARNING nova.compute.manager [req-06db2f22-af65-4574-9d2b-8046b9d470e3 req-ab059e07-db5b-48f9-8064-c26ba77c99d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Received unexpected event network-vif-plugged-cc69b7d6-8606-4956-8875-fc8b22e057f9 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.480 187164 DEBUG nova.compute.manager [req-06db2f22-af65-4574-9d2b-8046b9d470e3 req-ab059e07-db5b-48f9-8064-c26ba77c99d7 7c4adbb6dc784941bd9d4d7790043191 35b3c315c7fe4bc19f6db78df673cb8a - - default default] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Received event network-vif-deleted-cc69b7d6-8606-4956-8875-fc8b22e057f9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.482 187164 DEBUG oslo_concurrency.lockutils [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.550 187164 INFO nova.scheduler.client.report [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Deleted allocations for instance 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.645 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:41 np0005546954 nova_compute[187160]: 2025-12-05 13:14:41.840 187164 DEBUG oslo_concurrency.lockutils [None req-1db43f8e-2780-42a7-8ee7-c47722bf4194 db8a6eb3223545318fef56cd23348ce4 d9c39db0d5634bb28b51abbe0e5a2c82 - - default default] Lock "5e06965f-7c85-4c67-a69b-fd6a4d9c46ba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.974s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:14:43 np0005546954 nova_compute[187160]: 2025-12-05 13:14:43.224 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.040 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.041 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.041 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.041 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.042 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.042 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.088 187164 DEBUG nova.virt.libvirt.imagecache [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.089 187164 WARNING nova.virt.libvirt.imagecache [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.089 187164 INFO nova.virt.libvirt.imagecache [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Removable base files: /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.090 187164 INFO nova.virt.libvirt.imagecache [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/c9b7a6045cd59d7d8edb33eed052a432c9b974a4#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.090 187164 DEBUG nova.virt.libvirt.imagecache [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.090 187164 DEBUG nova.virt.libvirt.imagecache [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.090 187164 DEBUG nova.virt.libvirt.imagecache [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Dec  5 08:14:46 np0005546954 nova_compute[187160]: 2025-12-05 13:14:46.648 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:48 np0005546954 nova_compute[187160]: 2025-12-05 13:14:48.226 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:48 np0005546954 podman[220444]: 2025-12-05 13:14:48.590884907 +0000 UTC m=+0.086140717 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vcs-type=git)
Dec  5 08:14:48 np0005546954 podman[220445]: 2025-12-05 13:14:48.599872427 +0000 UTC m=+0.091618297 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd)
Dec  5 08:14:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:14:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:14:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:14:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:14:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:14:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:14:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:14:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:14:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:14:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:14:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:14:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:14:51 np0005546954 nova_compute[187160]: 2025-12-05 13:14:51.648 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:53 np0005546954 nova_compute[187160]: 2025-12-05 13:14:53.140 187164 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764940478.1388156, 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 08:14:53 np0005546954 nova_compute[187160]: 2025-12-05 13:14:53.141 187164 INFO nova.compute.manager [-] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] VM Stopped (Lifecycle Event)#033[00m
Dec  5 08:14:53 np0005546954 nova_compute[187160]: 2025-12-05 13:14:53.170 187164 DEBUG nova.compute.manager [None req-1a4b988c-3ab0-4930-ad7c-4f08e78bd85e - - - - - -] [instance: 5e06965f-7c85-4c67-a69b-fd6a4d9c46ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 08:14:53 np0005546954 nova_compute[187160]: 2025-12-05 13:14:53.230 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:56 np0005546954 nova_compute[187160]: 2025-12-05 13:14:56.650 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:14:58 np0005546954 nova_compute[187160]: 2025-12-05 13:14:58.232 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:01 np0005546954 podman[220482]: 2025-12-05 13:15:01.575085131 +0000 UTC m=+0.081758231 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  5 08:15:01 np0005546954 nova_compute[187160]: 2025-12-05 13:15:01.651 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:03 np0005546954 nova_compute[187160]: 2025-12-05 13:15:03.236 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:05 np0005546954 podman[197513]: time="2025-12-05T13:15:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:15:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:15:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:15:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:15:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2595 "" "Go-http-client/1.1"
Dec  5 08:15:06 np0005546954 nova_compute[187160]: 2025-12-05 13:15:06.652 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:08 np0005546954 nova_compute[187160]: 2025-12-05 13:15:08.239 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:08 np0005546954 podman[220502]: 2025-12-05 13:15:08.577210007 +0000 UTC m=+0.078424587 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 08:15:08 np0005546954 podman[220501]: 2025-12-05 13:15:08.613562686 +0000 UTC m=+0.126799150 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:15:10 np0005546954 nova_compute[187160]: 2025-12-05 13:15:10.353 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:11 np0005546954 nova_compute[187160]: 2025-12-05 13:15:11.654 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:13 np0005546954 nova_compute[187160]: 2025-12-05 13:15:13.242 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:16 np0005546954 nova_compute[187160]: 2025-12-05 13:15:16.656 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:15:16.980 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:15:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:15:16.981 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:15:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:15:16.981 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:15:18 np0005546954 nova_compute[187160]: 2025-12-05 13:15:18.246 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:19 np0005546954 nova_compute[187160]: 2025-12-05 13:15:19.090 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:19 np0005546954 nova_compute[187160]: 2025-12-05 13:15:19.090 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:15:19 np0005546954 nova_compute[187160]: 2025-12-05 13:15:19.091 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:15:19 np0005546954 nova_compute[187160]: 2025-12-05 13:15:19.111 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:15:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:15:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:15:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:15:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:15:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:15:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:15:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:15:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:15:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:15:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:15:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:15:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:15:19 np0005546954 podman[220553]: 2025-12-05 13:15:19.588853825 +0000 UTC m=+0.088876873 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41)
Dec  5 08:15:19 np0005546954 podman[220554]: 2025-12-05 13:15:19.624038427 +0000 UTC m=+0.119171972 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd)
Dec  5 08:15:21 np0005546954 nova_compute[187160]: 2025-12-05 13:15:21.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:21 np0005546954 nova_compute[187160]: 2025-12-05 13:15:21.659 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:22 np0005546954 nova_compute[187160]: 2025-12-05 13:15:22.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:23 np0005546954 nova_compute[187160]: 2025-12-05 13:15:23.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:23 np0005546954 nova_compute[187160]: 2025-12-05 13:15:23.248 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:25 np0005546954 nova_compute[187160]: 2025-12-05 13:15:25.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:26 np0005546954 nova_compute[187160]: 2025-12-05 13:15:26.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:26 np0005546954 nova_compute[187160]: 2025-12-05 13:15:26.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:26 np0005546954 nova_compute[187160]: 2025-12-05 13:15:26.038 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 08:15:26 np0005546954 nova_compute[187160]: 2025-12-05 13:15:26.060 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 08:15:26 np0005546954 nova_compute[187160]: 2025-12-05 13:15:26.662 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.061 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.103 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.103 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.104 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.104 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.288 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.289 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5883MB free_disk=73.32940673828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.289 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.289 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.523 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.523 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.607 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.623 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.645 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:15:27 np0005546954 nova_compute[187160]: 2025-12-05 13:15:27.646 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:15:28 np0005546954 nova_compute[187160]: 2025-12-05 13:15:28.251 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:31 np0005546954 nova_compute[187160]: 2025-12-05 13:15:31.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:31 np0005546954 nova_compute[187160]: 2025-12-05 13:15:31.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:15:31 np0005546954 nova_compute[187160]: 2025-12-05 13:15:31.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:31 np0005546954 nova_compute[187160]: 2025-12-05 13:15:31.663 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:32 np0005546954 podman[220591]: 2025-12-05 13:15:32.576942029 +0000 UTC m=+0.072519094 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 08:15:33 np0005546954 nova_compute[187160]: 2025-12-05 13:15:33.254 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:34 np0005546954 nova_compute[187160]: 2025-12-05 13:15:34.057 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:34 np0005546954 nova_compute[187160]: 2025-12-05 13:15:34.058 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:34 np0005546954 nova_compute[187160]: 2025-12-05 13:15:34.058 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 08:15:35 np0005546954 podman[197513]: time="2025-12-05T13:15:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:15:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:15:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:15:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:15:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Dec  5 08:15:36 np0005546954 nova_compute[187160]: 2025-12-05 13:15:36.666 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:36 np0005546954 nova_compute[187160]: 2025-12-05 13:15:36.862 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:37 np0005546954 nova_compute[187160]: 2025-12-05 13:15:37.180 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:15:38 np0005546954 nova_compute[187160]: 2025-12-05 13:15:38.258 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:39 np0005546954 podman[220612]: 2025-12-05 13:15:39.551809077 +0000 UTC m=+0.057380334 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 08:15:39 np0005546954 podman[220611]: 2025-12-05 13:15:39.574617326 +0000 UTC m=+0.084207097 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec  5 08:15:41 np0005546954 nova_compute[187160]: 2025-12-05 13:15:41.668 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:43 np0005546954 nova_compute[187160]: 2025-12-05 13:15:43.261 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:46 np0005546954 nova_compute[187160]: 2025-12-05 13:15:46.670 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:48 np0005546954 nova_compute[187160]: 2025-12-05 13:15:48.312 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:49 np0005546954 ovn_controller[95566]: 2025-12-05T13:15:49Z|00275|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec  5 08:15:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:15:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:15:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:15:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:15:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:15:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:15:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:15:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:15:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:15:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:15:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:15:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:15:50 np0005546954 podman[220663]: 2025-12-05 13:15:50.55881328 +0000 UTC m=+0.068486408 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.build-date=20251125)
Dec  5 08:15:50 np0005546954 podman[220662]: 2025-12-05 13:15:50.568638376 +0000 UTC m=+0.071856894 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec  5 08:15:51 np0005546954 nova_compute[187160]: 2025-12-05 13:15:51.709 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:53 np0005546954 nova_compute[187160]: 2025-12-05 13:15:53.314 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:56 np0005546954 nova_compute[187160]: 2025-12-05 13:15:56.712 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:15:58 np0005546954 nova_compute[187160]: 2025-12-05 13:15:58.317 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:01 np0005546954 nova_compute[187160]: 2025-12-05 13:16:01.713 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:03 np0005546954 nova_compute[187160]: 2025-12-05 13:16:03.320 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:03 np0005546954 podman[220703]: 2025-12-05 13:16:03.557534274 +0000 UTC m=+0.076542818 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 08:16:05 np0005546954 podman[197513]: time="2025-12-05T13:16:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:16:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:16:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:16:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:16:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2595 "" "Go-http-client/1.1"
Dec  5 08:16:06 np0005546954 nova_compute[187160]: 2025-12-05 13:16:06.715 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:08 np0005546954 nova_compute[187160]: 2025-12-05 13:16:08.323 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:10 np0005546954 podman[220728]: 2025-12-05 13:16:10.544270301 +0000 UTC m=+0.053900295 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 08:16:10 np0005546954 podman[220727]: 2025-12-05 13:16:10.599844108 +0000 UTC m=+0.103175096 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 08:16:10 np0005546954 systemd-logind[789]: New session 38 of user zuul.
Dec  5 08:16:10 np0005546954 systemd[1]: Started Session 38 of User zuul.
Dec  5 08:16:11 np0005546954 nova_compute[187160]: 2025-12-05 13:16:11.718 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:13 np0005546954 nova_compute[187160]: 2025-12-05 13:16:13.326 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:16 np0005546954 nova_compute[187160]: 2025-12-05 13:16:16.720 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:16:16.982 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:16:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:16:16.984 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:16:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:16:16.984 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:16:17 np0005546954 ovs-vsctl[220996]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  5 08:16:18 np0005546954 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 220805 (sos)
Dec  5 08:16:18 np0005546954 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec  5 08:16:18 np0005546954 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec  5 08:16:18 np0005546954 nova_compute[187160]: 2025-12-05 13:16:18.329 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:18 np0005546954 virtqemud[186730]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  5 08:16:18 np0005546954 virtqemud[186730]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  5 08:16:18 np0005546954 virtqemud[186730]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  5 08:16:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:16:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:16:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:16:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:16:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:16:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:16:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:16:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:16:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:16:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:16:21 np0005546954 nova_compute[187160]: 2025-12-05 13:16:21.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:16:21 np0005546954 nova_compute[187160]: 2025-12-05 13:16:21.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:16:21 np0005546954 nova_compute[187160]: 2025-12-05 13:16:21.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:16:21 np0005546954 nova_compute[187160]: 2025-12-05 13:16:21.073 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:16:21 np0005546954 podman[221493]: 2025-12-05 13:16:21.573151193 +0000 UTC m=+0.081797202 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 08:16:21 np0005546954 podman[221492]: 2025-12-05 13:16:21.584118884 +0000 UTC m=+0.092668480 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41)
Dec  5 08:16:21 np0005546954 nova_compute[187160]: 2025-12-05 13:16:21.722 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:22 np0005546954 systemd[1]: Starting Hostname Service...
Dec  5 08:16:22 np0005546954 nova_compute[187160]: 2025-12-05 13:16:22.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:16:22 np0005546954 nova_compute[187160]: 2025-12-05 13:16:22.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:16:22 np0005546954 systemd[1]: Started Hostname Service.
Dec  5 08:16:22 np0005546954 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  5 08:16:23 np0005546954 nova_compute[187160]: 2025-12-05 13:16:23.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:16:23 np0005546954 nova_compute[187160]: 2025-12-05 13:16:23.332 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:25 np0005546954 nova_compute[187160]: 2025-12-05 13:16:25.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:16:26 np0005546954 nova_compute[187160]: 2025-12-05 13:16:26.726 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.068 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.068 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.068 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.068 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.218 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.219 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5391MB free_disk=73.15169143676758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.220 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.220 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.334 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.342 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.343 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.394 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.421 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.423 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:16:28 np0005546954 nova_compute[187160]: 2025-12-05 13:16:28.423 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:16:28 np0005546954 ovs-appctl[222586]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  5 08:16:28 np0005546954 ovs-appctl[222593]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  5 08:16:28 np0005546954 ovs-appctl[222605]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  5 08:16:31 np0005546954 nova_compute[187160]: 2025-12-05 13:16:31.726 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:33 np0005546954 nova_compute[187160]: 2025-12-05 13:16:33.335 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:33 np0005546954 nova_compute[187160]: 2025-12-05 13:16:33.423 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:16:33 np0005546954 nova_compute[187160]: 2025-12-05 13:16:33.424 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:16:33 np0005546954 podman[223642]: 2025-12-05 13:16:33.955901681 +0000 UTC m=+0.094691812 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  5 08:16:34 np0005546954 nova_compute[187160]: 2025-12-05 13:16:34.041 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:16:35 np0005546954 podman[197513]: time="2025-12-05T13:16:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:16:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:16:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:16:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:16:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec  5 08:16:35 np0005546954 virtqemud[186730]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  5 08:16:36 np0005546954 nova_compute[187160]: 2025-12-05 13:16:36.728 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:37 np0005546954 systemd[1]: Starting Time & Date Service...
Dec  5 08:16:37 np0005546954 systemd[1]: Started Time & Date Service.
Dec  5 08:16:38 np0005546954 nova_compute[187160]: 2025-12-05 13:16:38.338 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:41 np0005546954 podman[224126]: 2025-12-05 13:16:41.554164165 +0000 UTC m=+0.067587711 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 08:16:41 np0005546954 podman[224125]: 2025-12-05 13:16:41.603040112 +0000 UTC m=+0.106283142 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:16:41 np0005546954 nova_compute[187160]: 2025-12-05 13:16:41.731 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:43 np0005546954 nova_compute[187160]: 2025-12-05 13:16:43.343 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:46 np0005546954 nova_compute[187160]: 2025-12-05 13:16:46.732 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:48 np0005546954 nova_compute[187160]: 2025-12-05 13:16:48.345 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:16:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:16:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:16:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:16:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:16:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:16:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:16:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:16:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:16:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:16:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:16:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:16:51 np0005546954 nova_compute[187160]: 2025-12-05 13:16:51.736 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:52 np0005546954 podman[224177]: 2025-12-05 13:16:52.413020536 +0000 UTC m=+0.080451090 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_id=edpm, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec  5 08:16:52 np0005546954 podman[224178]: 2025-12-05 13:16:52.420676063 +0000 UTC m=+0.084072002 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:16:53 np0005546954 nova_compute[187160]: 2025-12-05 13:16:53.349 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:56 np0005546954 nova_compute[187160]: 2025-12-05 13:16:56.738 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:16:56 np0005546954 systemd[1]: session-38.scope: Deactivated successfully.
Dec  5 08:16:56 np0005546954 systemd[1]: session-38.scope: Consumed 1min 14.990s CPU time, 489.2M memory peak, read 102.8M from disk, written 44.2M to disk.
Dec  5 08:16:56 np0005546954 systemd-logind[789]: Session 38 logged out. Waiting for processes to exit.
Dec  5 08:16:56 np0005546954 systemd-logind[789]: Removed session 38.
Dec  5 08:16:57 np0005546954 systemd-logind[789]: New session 39 of user zuul.
Dec  5 08:16:57 np0005546954 systemd[1]: Started Session 39 of User zuul.
Dec  5 08:16:57 np0005546954 systemd-logind[789]: Session 39 logged out. Waiting for processes to exit.
Dec  5 08:16:57 np0005546954 systemd[1]: session-39.scope: Deactivated successfully.
Dec  5 08:16:57 np0005546954 systemd-logind[789]: Removed session 39.
Dec  5 08:16:57 np0005546954 systemd-logind[789]: New session 40 of user zuul.
Dec  5 08:16:57 np0005546954 systemd[1]: Started Session 40 of User zuul.
Dec  5 08:16:58 np0005546954 systemd[1]: session-40.scope: Deactivated successfully.
Dec  5 08:16:58 np0005546954 systemd-logind[789]: Session 40 logged out. Waiting for processes to exit.
Dec  5 08:16:58 np0005546954 systemd-logind[789]: Removed session 40.
Dec  5 08:16:58 np0005546954 nova_compute[187160]: 2025-12-05 13:16:58.353 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:01 np0005546954 nova_compute[187160]: 2025-12-05 13:17:01.739 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:03 np0005546954 nova_compute[187160]: 2025-12-05 13:17:03.358 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:04 np0005546954 podman[224276]: 2025-12-05 13:17:04.566151021 +0000 UTC m=+0.063294046 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 08:17:05 np0005546954 podman[197513]: time="2025-12-05T13:17:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:17:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:17:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:17:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:17:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec  5 08:17:06 np0005546954 nova_compute[187160]: 2025-12-05 13:17:06.741 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:07 np0005546954 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  5 08:17:07 np0005546954 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  5 08:17:08 np0005546954 nova_compute[187160]: 2025-12-05 13:17:08.361 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:11 np0005546954 nova_compute[187160]: 2025-12-05 13:17:11.743 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:12 np0005546954 podman[224301]: 2025-12-05 13:17:12.563416609 +0000 UTC m=+0.076587589 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 08:17:12 np0005546954 podman[224300]: 2025-12-05 13:17:12.644944282 +0000 UTC m=+0.153108587 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:17:13 np0005546954 nova_compute[187160]: 2025-12-05 13:17:13.363 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:16 np0005546954 nova_compute[187160]: 2025-12-05 13:17:16.745 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:17:16.983 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:17:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:17:16.984 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:17:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:17:16.985 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:17:18 np0005546954 nova_compute[187160]: 2025-12-05 13:17:18.366 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:17:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:17:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:17:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:17:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:17:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:17:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:17:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:17:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:17:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:17:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:17:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:17:21 np0005546954 nova_compute[187160]: 2025-12-05 13:17:21.041 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:17:21 np0005546954 nova_compute[187160]: 2025-12-05 13:17:21.042 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:17:21 np0005546954 nova_compute[187160]: 2025-12-05 13:17:21.043 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:17:21 np0005546954 nova_compute[187160]: 2025-12-05 13:17:21.746 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:22 np0005546954 podman[224349]: 2025-12-05 13:17:22.546869846 +0000 UTC m=+0.055721492 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 08:17:22 np0005546954 podman[224348]: 2025-12-05 13:17:22.546919118 +0000 UTC m=+0.051761519 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec  5 08:17:23 np0005546954 nova_compute[187160]: 2025-12-05 13:17:23.369 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:23 np0005546954 nova_compute[187160]: 2025-12-05 13:17:23.781 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:17:23 np0005546954 nova_compute[187160]: 2025-12-05 13:17:23.782 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:17:23 np0005546954 nova_compute[187160]: 2025-12-05 13:17:23.782 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:17:24 np0005546954 nova_compute[187160]: 2025-12-05 13:17:24.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:17:25 np0005546954 nova_compute[187160]: 2025-12-05 13:17:25.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:17:26 np0005546954 nova_compute[187160]: 2025-12-05 13:17:26.747 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:28 np0005546954 nova_compute[187160]: 2025-12-05 13:17:28.373 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:29 np0005546954 nova_compute[187160]: 2025-12-05 13:17:29.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:17:30 np0005546954 nova_compute[187160]: 2025-12-05 13:17:30.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:17:31 np0005546954 nova_compute[187160]: 2025-12-05 13:17:31.095 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:17:31 np0005546954 nova_compute[187160]: 2025-12-05 13:17:31.095 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:17:31 np0005546954 nova_compute[187160]: 2025-12-05 13:17:31.095 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:17:31 np0005546954 nova_compute[187160]: 2025-12-05 13:17:31.096 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:17:31 np0005546954 nova_compute[187160]: 2025-12-05 13:17:31.236 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:17:31 np0005546954 nova_compute[187160]: 2025-12-05 13:17:31.238 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5782MB free_disk=73.32915496826172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:17:31 np0005546954 nova_compute[187160]: 2025-12-05 13:17:31.238 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:17:31 np0005546954 nova_compute[187160]: 2025-12-05 13:17:31.238 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:17:31 np0005546954 nova_compute[187160]: 2025-12-05 13:17:31.748 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:33 np0005546954 nova_compute[187160]: 2025-12-05 13:17:33.377 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:33 np0005546954 nova_compute[187160]: 2025-12-05 13:17:33.663 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:17:33 np0005546954 nova_compute[187160]: 2025-12-05 13:17:33.664 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:17:33 np0005546954 nova_compute[187160]: 2025-12-05 13:17:33.688 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:17:34 np0005546954 nova_compute[187160]: 2025-12-05 13:17:34.203 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:17:34 np0005546954 nova_compute[187160]: 2025-12-05 13:17:34.205 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:17:34 np0005546954 nova_compute[187160]: 2025-12-05 13:17:34.205 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.967s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:17:35 np0005546954 podman[224387]: 2025-12-05 13:17:35.550174446 +0000 UTC m=+0.060662656 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 08:17:35 np0005546954 podman[197513]: time="2025-12-05T13:17:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:17:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:17:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:17:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:17:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Dec  5 08:17:36 np0005546954 nova_compute[187160]: 2025-12-05 13:17:36.820 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:38 np0005546954 nova_compute[187160]: 2025-12-05 13:17:38.206 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:17:38 np0005546954 nova_compute[187160]: 2025-12-05 13:17:38.224 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:17:38 np0005546954 nova_compute[187160]: 2025-12-05 13:17:38.224 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:17:38 np0005546954 nova_compute[187160]: 2025-12-05 13:17:38.224 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:17:38 np0005546954 nova_compute[187160]: 2025-12-05 13:17:38.381 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:41 np0005546954 nova_compute[187160]: 2025-12-05 13:17:41.821 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:43 np0005546954 nova_compute[187160]: 2025-12-05 13:17:43.385 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:43 np0005546954 podman[224407]: 2025-12-05 13:17:43.546220598 +0000 UTC m=+0.055743793 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:17:43 np0005546954 podman[224406]: 2025-12-05 13:17:43.572554816 +0000 UTC m=+0.089332616 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  5 08:17:46 np0005546954 nova_compute[187160]: 2025-12-05 13:17:46.824 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:48 np0005546954 nova_compute[187160]: 2025-12-05 13:17:48.387 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:17:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:17:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:17:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:17:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:17:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:17:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:17:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:17:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:17:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:17:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:17:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:17:51 np0005546954 nova_compute[187160]: 2025-12-05 13:17:51.827 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:53 np0005546954 nova_compute[187160]: 2025-12-05 13:17:53.390 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:53 np0005546954 podman[224457]: 2025-12-05 13:17:53.544093623 +0000 UTC m=+0.047371662 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 08:17:53 np0005546954 podman[224456]: 2025-12-05 13:17:53.555448836 +0000 UTC m=+0.050765728 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41)
Dec  5 08:17:56 np0005546954 nova_compute[187160]: 2025-12-05 13:17:56.831 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:17:58 np0005546954 nova_compute[187160]: 2025-12-05 13:17:58.393 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:01 np0005546954 nova_compute[187160]: 2025-12-05 13:18:01.834 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:03 np0005546954 nova_compute[187160]: 2025-12-05 13:18:03.396 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:05 np0005546954 podman[197513]: time="2025-12-05T13:18:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:18:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:18:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:18:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:18:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec  5 08:18:06 np0005546954 podman[224497]: 2025-12-05 13:18:06.536077808 +0000 UTC m=+0.053058950 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Dec  5 08:18:06 np0005546954 nova_compute[187160]: 2025-12-05 13:18:06.837 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:08 np0005546954 nova_compute[187160]: 2025-12-05 13:18:08.399 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:11 np0005546954 nova_compute[187160]: 2025-12-05 13:18:11.839 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:13 np0005546954 nova_compute[187160]: 2025-12-05 13:18:13.403 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:14 np0005546954 podman[224518]: 2025-12-05 13:18:14.55702784 +0000 UTC m=+0.057765415 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 08:18:14 np0005546954 podman[224517]: 2025-12-05 13:18:14.601846432 +0000 UTC m=+0.098126369 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  5 08:18:16 np0005546954 nova_compute[187160]: 2025-12-05 13:18:16.839 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:18:16.984 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:18:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:18:16.984 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:18:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:18:16.984 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:18:18 np0005546954 nova_compute[187160]: 2025-12-05 13:18:18.405 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:18:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:18:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:18:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:18:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:18:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:18:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:18:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:18:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:18:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:18:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:18:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:18:21 np0005546954 nova_compute[187160]: 2025-12-05 13:18:21.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:18:21 np0005546954 nova_compute[187160]: 2025-12-05 13:18:21.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:18:21 np0005546954 nova_compute[187160]: 2025-12-05 13:18:21.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:18:21 np0005546954 nova_compute[187160]: 2025-12-05 13:18:21.056 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:18:21 np0005546954 nova_compute[187160]: 2025-12-05 13:18:21.841 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:23 np0005546954 nova_compute[187160]: 2025-12-05 13:18:23.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:18:23 np0005546954 nova_compute[187160]: 2025-12-05 13:18:23.477 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:24 np0005546954 podman[224569]: 2025-12-05 13:18:24.53797605 +0000 UTC m=+0.047086803 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  5 08:18:24 np0005546954 podman[224568]: 2025-12-05 13:18:24.588125856 +0000 UTC m=+0.096986631 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec  5 08:18:25 np0005546954 nova_compute[187160]: 2025-12-05 13:18:25.033 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:18:25 np0005546954 nova_compute[187160]: 2025-12-05 13:18:25.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:18:26 np0005546954 nova_compute[187160]: 2025-12-05 13:18:26.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:18:26 np0005546954 nova_compute[187160]: 2025-12-05 13:18:26.842 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:28 np0005546954 nova_compute[187160]: 2025-12-05 13:18:28.480 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:31 np0005546954 nova_compute[187160]: 2025-12-05 13:18:31.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:18:31 np0005546954 nova_compute[187160]: 2025-12-05 13:18:31.844 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.099 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.099 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.100 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.100 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.283 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.284 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5814MB free_disk=73.32915496826172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.284 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.285 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.336 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.337 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.365 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.380 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.382 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:18:32 np0005546954 nova_compute[187160]: 2025-12-05 13:18:32.382 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:18:33 np0005546954 nova_compute[187160]: 2025-12-05 13:18:33.483 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:34 np0005546954 nova_compute[187160]: 2025-12-05 13:18:34.382 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:18:34 np0005546954 nova_compute[187160]: 2025-12-05 13:18:34.383 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:18:35 np0005546954 nova_compute[187160]: 2025-12-05 13:18:35.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:18:35 np0005546954 podman[197513]: time="2025-12-05T13:18:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:18:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:18:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:18:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:18:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec  5 08:18:36 np0005546954 nova_compute[187160]: 2025-12-05 13:18:36.847 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:37 np0005546954 podman[224611]: 2025-12-05 13:18:37.558343797 +0000 UTC m=+0.063424371 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Dec  5 08:18:38 np0005546954 nova_compute[187160]: 2025-12-05 13:18:38.486 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:41 np0005546954 nova_compute[187160]: 2025-12-05 13:18:41.847 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:43 np0005546954 nova_compute[187160]: 2025-12-05 13:18:43.490 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:45 np0005546954 podman[224633]: 2025-12-05 13:18:45.575905367 +0000 UTC m=+0.071871344 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:18:45 np0005546954 podman[224632]: 2025-12-05 13:18:45.639770291 +0000 UTC m=+0.141520498 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:18:46 np0005546954 nova_compute[187160]: 2025-12-05 13:18:46.849 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:48 np0005546954 nova_compute[187160]: 2025-12-05 13:18:48.493 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:18:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:18:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:18:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:18:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:18:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:18:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:18:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:18:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:18:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:18:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:18:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:18:51 np0005546954 nova_compute[187160]: 2025-12-05 13:18:51.851 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:53 np0005546954 nova_compute[187160]: 2025-12-05 13:18:53.496 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:55 np0005546954 podman[224678]: 2025-12-05 13:18:55.529035094 +0000 UTC m=+0.048505660 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9-minimal)
Dec  5 08:18:55 np0005546954 podman[224679]: 2025-12-05 13:18:55.535897328 +0000 UTC m=+0.051943408 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 08:18:56 np0005546954 nova_compute[187160]: 2025-12-05 13:18:56.854 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:18:58 np0005546954 nova_compute[187160]: 2025-12-05 13:18:58.499 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:01 np0005546954 nova_compute[187160]: 2025-12-05 13:19:01.897 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:03 np0005546954 nova_compute[187160]: 2025-12-05 13:19:03.502 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:05 np0005546954 podman[197513]: time="2025-12-05T13:19:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:19:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:19:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:19:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:19:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Dec  5 08:19:06 np0005546954 nova_compute[187160]: 2025-12-05 13:19:06.899 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:08 np0005546954 nova_compute[187160]: 2025-12-05 13:19:08.506 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:08 np0005546954 podman[224724]: 2025-12-05 13:19:08.590685689 +0000 UTC m=+0.093074827 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:19:11 np0005546954 nova_compute[187160]: 2025-12-05 13:19:11.960 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:13 np0005546954 nova_compute[187160]: 2025-12-05 13:19:13.509 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:16 np0005546954 podman[224744]: 2025-12-05 13:19:16.563336711 +0000 UTC m=+0.067563753 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 08:19:16 np0005546954 podman[224743]: 2025-12-05 13:19:16.611518801 +0000 UTC m=+0.114849505 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  5 08:19:16 np0005546954 nova_compute[187160]: 2025-12-05 13:19:16.961 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:19:16.985 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:19:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:19:16.987 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:19:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:19:16.988 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:19:18 np0005546954 nova_compute[187160]: 2025-12-05 13:19:18.512 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:19:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:19:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:19:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:19:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:19:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:19:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:19:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:19:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:19:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:19:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:19:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:19:21 np0005546954 nova_compute[187160]: 2025-12-05 13:19:21.962 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:22 np0005546954 nova_compute[187160]: 2025-12-05 13:19:22.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:19:22 np0005546954 nova_compute[187160]: 2025-12-05 13:19:22.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:19:22 np0005546954 nova_compute[187160]: 2025-12-05 13:19:22.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:19:22 np0005546954 nova_compute[187160]: 2025-12-05 13:19:22.459 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:19:23 np0005546954 nova_compute[187160]: 2025-12-05 13:19:23.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:19:23 np0005546954 nova_compute[187160]: 2025-12-05 13:19:23.515 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:25 np0005546954 nova_compute[187160]: 2025-12-05 13:19:25.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:19:26 np0005546954 nova_compute[187160]: 2025-12-05 13:19:26.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:19:26 np0005546954 nova_compute[187160]: 2025-12-05 13:19:26.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:19:26 np0005546954 podman[224793]: 2025-12-05 13:19:26.55724743 +0000 UTC m=+0.069336528 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350)
Dec  5 08:19:26 np0005546954 podman[224794]: 2025-12-05 13:19:26.566953522 +0000 UTC m=+0.070036250 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 08:19:26 np0005546954 nova_compute[187160]: 2025-12-05 13:19:26.963 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:28 np0005546954 nova_compute[187160]: 2025-12-05 13:19:28.519 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:31 np0005546954 nova_compute[187160]: 2025-12-05 13:19:31.966 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:32 np0005546954 nova_compute[187160]: 2025-12-05 13:19:32.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:19:32 np0005546954 nova_compute[187160]: 2025-12-05 13:19:32.339 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:19:32 np0005546954 nova_compute[187160]: 2025-12-05 13:19:32.340 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:19:32 np0005546954 nova_compute[187160]: 2025-12-05 13:19:32.340 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:19:32 np0005546954 nova_compute[187160]: 2025-12-05 13:19:32.340 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:19:32 np0005546954 nova_compute[187160]: 2025-12-05 13:19:32.536 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:19:32 np0005546954 nova_compute[187160]: 2025-12-05 13:19:32.538 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5816MB free_disk=73.32915496826172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:19:32 np0005546954 nova_compute[187160]: 2025-12-05 13:19:32.539 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:19:32 np0005546954 nova_compute[187160]: 2025-12-05 13:19:32.540 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:19:33 np0005546954 nova_compute[187160]: 2025-12-05 13:19:33.138 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:19:33 np0005546954 nova_compute[187160]: 2025-12-05 13:19:33.139 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:19:33 np0005546954 nova_compute[187160]: 2025-12-05 13:19:33.174 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing inventories for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 08:19:33 np0005546954 nova_compute[187160]: 2025-12-05 13:19:33.195 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating ProviderTree inventory for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 08:19:33 np0005546954 nova_compute[187160]: 2025-12-05 13:19:33.195 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 08:19:33 np0005546954 nova_compute[187160]: 2025-12-05 13:19:33.214 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing aggregate associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 08:19:33 np0005546954 nova_compute[187160]: 2025-12-05 13:19:33.241 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing trait associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 08:19:33 np0005546954 nova_compute[187160]: 2025-12-05 13:19:33.265 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:19:33 np0005546954 nova_compute[187160]: 2025-12-05 13:19:33.290 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:19:33 np0005546954 nova_compute[187160]: 2025-12-05 13:19:33.291 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:19:33 np0005546954 nova_compute[187160]: 2025-12-05 13:19:33.291 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.752s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:19:33 np0005546954 nova_compute[187160]: 2025-12-05 13:19:33.522 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:34 np0005546954 nova_compute[187160]: 2025-12-05 13:19:34.292 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:19:34 np0005546954 nova_compute[187160]: 2025-12-05 13:19:34.293 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:19:34 np0005546954 nova_compute[187160]: 2025-12-05 13:19:34.293 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:19:35 np0005546954 nova_compute[187160]: 2025-12-05 13:19:35.041 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:19:35 np0005546954 podman[197513]: time="2025-12-05T13:19:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:19:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:19:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:19:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:19:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Dec  5 08:19:36 np0005546954 nova_compute[187160]: 2025-12-05 13:19:36.967 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:38 np0005546954 nova_compute[187160]: 2025-12-05 13:19:38.525 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:39 np0005546954 podman[224831]: 2025-12-05 13:19:39.572766337 +0000 UTC m=+0.076755840 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 08:19:41 np0005546954 nova_compute[187160]: 2025-12-05 13:19:41.969 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:42 np0005546954 nova_compute[187160]: 2025-12-05 13:19:42.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:19:43 np0005546954 nova_compute[187160]: 2025-12-05 13:19:43.528 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:46 np0005546954 nova_compute[187160]: 2025-12-05 13:19:46.971 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:47 np0005546954 podman[224850]: 2025-12-05 13:19:47.592117923 +0000 UTC m=+0.092316394 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 08:19:47 np0005546954 podman[224849]: 2025-12-05 13:19:47.635257564 +0000 UTC m=+0.130701887 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec  5 08:19:48 np0005546954 nova_compute[187160]: 2025-12-05 13:19:48.531 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:19:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:19:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:19:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:19:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:19:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:19:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:19:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:19:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:19:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:19:51 np0005546954 nova_compute[187160]: 2025-12-05 13:19:51.974 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:53 np0005546954 nova_compute[187160]: 2025-12-05 13:19:53.535 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:56 np0005546954 nova_compute[187160]: 2025-12-05 13:19:56.976 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:19:57 np0005546954 podman[224900]: 2025-12-05 13:19:57.566783881 +0000 UTC m=+0.073302292 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 08:19:57 np0005546954 podman[224899]: 2025-12-05 13:19:57.578402551 +0000 UTC m=+0.084864220 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec  5 08:19:58 np0005546954 nova_compute[187160]: 2025-12-05 13:19:58.538 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:01 np0005546954 nova_compute[187160]: 2025-12-05 13:20:01.978 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:03 np0005546954 nova_compute[187160]: 2025-12-05 13:20:03.541 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:05 np0005546954 podman[197513]: time="2025-12-05T13:20:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:20:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:20:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:20:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:20:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec  5 08:20:06 np0005546954 nova_compute[187160]: 2025-12-05 13:20:06.980 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:08 np0005546954 nova_compute[187160]: 2025-12-05 13:20:08.544 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:10 np0005546954 podman[224939]: 2025-12-05 13:20:10.543387328 +0000 UTC m=+0.057975374 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:20:11 np0005546954 nova_compute[187160]: 2025-12-05 13:20:11.982 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:13 np0005546954 nova_compute[187160]: 2025-12-05 13:20:13.547 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:16 np0005546954 nova_compute[187160]: 2025-12-05 13:20:16.983 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:20:16.985 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:20:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:20:16.986 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:20:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:20:16.986 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:20:18 np0005546954 nova_compute[187160]: 2025-12-05 13:20:18.551 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:18 np0005546954 podman[224958]: 2025-12-05 13:20:18.584518961 +0000 UTC m=+0.088014789 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Dec  5 08:20:18 np0005546954 podman[224959]: 2025-12-05 13:20:18.609213239 +0000 UTC m=+0.098031871 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 08:20:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:20:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:20:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:20:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:20:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:20:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:20:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:20:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:20:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:20:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:20:21 np0005546954 nova_compute[187160]: 2025-12-05 13:20:21.985 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:23 np0005546954 nova_compute[187160]: 2025-12-05 13:20:23.554 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:24 np0005546954 nova_compute[187160]: 2025-12-05 13:20:24.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:20:24 np0005546954 nova_compute[187160]: 2025-12-05 13:20:24.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:20:24 np0005546954 nova_compute[187160]: 2025-12-05 13:20:24.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:20:26 np0005546954 nova_compute[187160]: 2025-12-05 13:20:26.987 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:27 np0005546954 nova_compute[187160]: 2025-12-05 13:20:27.712 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:20:27 np0005546954 nova_compute[187160]: 2025-12-05 13:20:27.713 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:20:27 np0005546954 nova_compute[187160]: 2025-12-05 13:20:27.713 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:20:28 np0005546954 nova_compute[187160]: 2025-12-05 13:20:28.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:20:28 np0005546954 nova_compute[187160]: 2025-12-05 13:20:28.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:20:28 np0005546954 nova_compute[187160]: 2025-12-05 13:20:28.558 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:28 np0005546954 podman[225008]: 2025-12-05 13:20:28.56146381 +0000 UTC m=+0.071083422 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container)
Dec  5 08:20:28 np0005546954 podman[225009]: 2025-12-05 13:20:28.579907144 +0000 UTC m=+0.086313326 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_managed=true)
Dec  5 08:20:31 np0005546954 nova_compute[187160]: 2025-12-05 13:20:31.989 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:33 np0005546954 nova_compute[187160]: 2025-12-05 13:20:33.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:20:33 np0005546954 nova_compute[187160]: 2025-12-05 13:20:33.562 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:20:33 np0005546954 nova_compute[187160]: 2025-12-05 13:20:33.708 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:20:33 np0005546954 nova_compute[187160]: 2025-12-05 13:20:33.709 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:20:33 np0005546954 nova_compute[187160]: 2025-12-05 13:20:33.709 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:20:33 np0005546954 nova_compute[187160]: 2025-12-05 13:20:33.709 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:20:33 np0005546954 nova_compute[187160]: 2025-12-05 13:20:33.842 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:20:33 np0005546954 nova_compute[187160]: 2025-12-05 13:20:33.843 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5831MB free_disk=73.33554458618164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  5 08:20:33 np0005546954 nova_compute[187160]: 2025-12-05 13:20:33.843 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 08:20:33 np0005546954 nova_compute[187160]: 2025-12-05 13:20:33.843 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 08:20:34 np0005546954 nova_compute[187160]: 2025-12-05 13:20:34.084 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  5 08:20:34 np0005546954 nova_compute[187160]: 2025-12-05 13:20:34.085 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  5 08:20:34 np0005546954 nova_compute[187160]: 2025-12-05 13:20:34.154 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 08:20:34 np0005546954 nova_compute[187160]: 2025-12-05 13:20:34.326 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 08:20:34 np0005546954 nova_compute[187160]: 2025-12-05 13:20:34.328 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  5 08:20:34 np0005546954 nova_compute[187160]: 2025-12-05 13:20:34.328 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.485s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 08:20:34 np0005546954 nova_compute[187160]: 2025-12-05 13:20:34.329 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 08:20:35 np0005546954 podman[197513]: time="2025-12-05T13:20:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:20:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:20:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:20:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:20:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Dec  5 08:20:36 np0005546954 nova_compute[187160]: 2025-12-05 13:20:36.055 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 08:20:36 np0005546954 nova_compute[187160]: 2025-12-05 13:20:36.056 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 08:20:36 np0005546954 nova_compute[187160]: 2025-12-05 13:20:36.056 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 08:20:36 np0005546954 nova_compute[187160]: 2025-12-05 13:20:36.056 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  5 08:20:36 np0005546954 nova_compute[187160]: 2025-12-05 13:20:36.057 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 08:20:36 np0005546954 nova_compute[187160]: 2025-12-05 13:20:36.057 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec  5 08:20:36 np0005546954 nova_compute[187160]: 2025-12-05 13:20:36.500 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec  5 08:20:36 np0005546954 nova_compute[187160]: 2025-12-05 13:20:36.991 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:20:38 np0005546954 nova_compute[187160]: 2025-12-05 13:20:38.565 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:20:41 np0005546954 podman[225049]: 2025-12-05 13:20:41.543865749 +0000 UTC m=+0.054488556 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  5 08:20:41 np0005546954 nova_compute[187160]: 2025-12-05 13:20:41.993 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:20:43 np0005546954 nova_compute[187160]: 2025-12-05 13:20:43.569 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:20:46 np0005546954 nova_compute[187160]: 2025-12-05 13:20:46.995 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:20:48 np0005546954 nova_compute[187160]: 2025-12-05 13:20:48.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 08:20:48 np0005546954 nova_compute[187160]: 2025-12-05 13:20:48.041 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec  5 08:20:48 np0005546954 nova_compute[187160]: 2025-12-05 13:20:48.575 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:20:49 np0005546954 podman[225069]: 2025-12-05 13:20:49.061271177 +0000 UTC m=+0.069719980 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 08:20:49 np0005546954 podman[225068]: 2025-12-05 13:20:49.161953 +0000 UTC m=+0.170404723 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  5 08:20:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:20:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:20:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:20:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:20:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:20:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:20:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:20:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:20:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:20:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:20:51 np0005546954 nova_compute[187160]: 2025-12-05 13:20:51.997 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:20:53 np0005546954 nova_compute[187160]: 2025-12-05 13:20:53.578 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:20:56 np0005546954 nova_compute[187160]: 2025-12-05 13:20:56.998 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:20:58 np0005546954 nova_compute[187160]: 2025-12-05 13:20:58.582 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:20:59 np0005546954 podman[225117]: 2025-12-05 13:20:59.55802925 +0000 UTC m=+0.070481003 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container)
Dec  5 08:20:59 np0005546954 podman[225118]: 2025-12-05 13:20:59.577268199 +0000 UTC m=+0.087633528 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 08:21:02 np0005546954 nova_compute[187160]: 2025-12-05 13:21:02.000 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:21:03 np0005546954 nova_compute[187160]: 2025-12-05 13:21:03.584 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:21:05 np0005546954 podman[197513]: time="2025-12-05T13:21:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:21:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:21:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:21:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:21:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Dec  5 08:21:07 np0005546954 nova_compute[187160]: 2025-12-05 13:21:07.001 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:21:08 np0005546954 nova_compute[187160]: 2025-12-05 13:21:08.587 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:21:12 np0005546954 nova_compute[187160]: 2025-12-05 13:21:12.003 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:21:12 np0005546954 nova_compute[187160]: 2025-12-05 13:21:12.454 187164 DEBUG oslo_concurrency.processutils [None req-df3a9858-b12b-40c1-bf12-a91a1fbd2f0c 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 08:21:12 np0005546954 nova_compute[187160]: 2025-12-05 13:21:12.492 187164 DEBUG oslo_concurrency.processutils [None req-df3a9858-b12b-40c1-bf12-a91a1fbd2f0c 9ad2c12994dc4e638d5f9a363e07727e 83916c53de6f404f91206339303e1b23 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 08:21:12 np0005546954 podman[225159]: 2025-12-05 13:21:12.879615994 +0000 UTC m=+0.103824002 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec  5 08:21:13 np0005546954 nova_compute[187160]: 2025-12-05 13:21:13.590 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:21:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:21:16.987 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 08:21:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:21:16.988 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 08:21:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:21:16.989 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 08:21:17 np0005546954 nova_compute[187160]: 2025-12-05 13:21:17.004 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:21:18 np0005546954 nova_compute[187160]: 2025-12-05 13:21:18.594 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:21:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:21:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:21:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:21:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:21:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:21:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:21:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:21:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:21:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:21:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:21:19 np0005546954 podman[225179]: 2025-12-05 13:21:19.549796507 +0000 UTC m=+0.061657809 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 08:21:19 np0005546954 podman[225178]: 2025-12-05 13:21:19.636845946 +0000 UTC m=+0.153518998 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 08:21:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:21:19.717 104428 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:56:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '32:90:88:ab:74:32'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  5 08:21:19 np0005546954 nova_compute[187160]: 2025-12-05 13:21:19.718 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:21:19 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:21:19.719 104428 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  5 08:21:22 np0005546954 nova_compute[187160]: 2025-12-05 13:21:22.007 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:21:23 np0005546954 nova_compute[187160]: 2025-12-05 13:21:23.598 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:21:25 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:21:25.722 104428 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f9f74c-08f9-451f-9678-93bb9e8fa2fe, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 08:21:26 np0005546954 nova_compute[187160]: 2025-12-05 13:21:26.196 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 08:21:26 np0005546954 nova_compute[187160]: 2025-12-05 13:21:26.197 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  5 08:21:26 np0005546954 nova_compute[187160]: 2025-12-05 13:21:26.197 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  5 08:21:26 np0005546954 nova_compute[187160]: 2025-12-05 13:21:26.873 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  5 08:21:27 np0005546954 nova_compute[187160]: 2025-12-05 13:21:27.008 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 08:21:27 np0005546954 nova_compute[187160]: 2025-12-05 13:21:27.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:21:27 np0005546954 nova_compute[187160]: 2025-12-05 13:21:27.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:21:28 np0005546954 nova_compute[187160]: 2025-12-05 13:21:28.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:21:28 np0005546954 nova_compute[187160]: 2025-12-05 13:21:28.655 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:21:29 np0005546954 nova_compute[187160]: 2025-12-05 13:21:29.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:21:30 np0005546954 podman[225228]: 2025-12-05 13:21:30.5674586 +0000 UTC m=+0.072672392 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible)
Dec  5 08:21:30 np0005546954 podman[225227]: 2025-12-05 13:21:30.586465232 +0000 UTC m=+0.087446132 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter)
Dec  5 08:21:32 np0005546954 nova_compute[187160]: 2025-12-05 13:21:32.010 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.154 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.155 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.155 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.155 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.334 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.335 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5826MB free_disk=73.33341598510742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.336 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.336 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.429 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.430 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.460 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.477 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.478 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.478 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:21:33 np0005546954 nova_compute[187160]: 2025-12-05 13:21:33.658 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:21:35 np0005546954 nova_compute[187160]: 2025-12-05 13:21:35.479 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:21:35 np0005546954 nova_compute[187160]: 2025-12-05 13:21:35.479 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:21:35 np0005546954 nova_compute[187160]: 2025-12-05 13:21:35.480 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:21:35 np0005546954 podman[197513]: time="2025-12-05T13:21:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:21:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:21:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:21:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:21:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2594 "" "Go-http-client/1.1"
Dec  5 08:21:36 np0005546954 nova_compute[187160]: 2025-12-05 13:21:36.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:21:37 np0005546954 nova_compute[187160]: 2025-12-05 13:21:37.012 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:21:38 np0005546954 nova_compute[187160]: 2025-12-05 13:21:38.660 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:21:42 np0005546954 nova_compute[187160]: 2025-12-05 13:21:42.015 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:21:43 np0005546954 podman[225266]: 2025-12-05 13:21:43.524962395 +0000 UTC m=+0.042842824 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:21:43 np0005546954 nova_compute[187160]: 2025-12-05 13:21:43.663 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:21:47 np0005546954 nova_compute[187160]: 2025-12-05 13:21:47.017 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:21:47 np0005546954 nova_compute[187160]: 2025-12-05 13:21:47.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:21:48 np0005546954 nova_compute[187160]: 2025-12-05 13:21:48.666 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:21:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:21:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:21:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:21:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:21:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:21:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:21:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:21:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:21:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:21:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:21:50 np0005546954 podman[225288]: 2025-12-05 13:21:50.562444538 +0000 UTC m=+0.068619297 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:21:50 np0005546954 podman[225287]: 2025-12-05 13:21:50.577196667 +0000 UTC m=+0.096173094 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:21:52 np0005546954 nova_compute[187160]: 2025-12-05 13:21:52.020 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:21:53 np0005546954 nova_compute[187160]: 2025-12-05 13:21:53.669 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:21:57 np0005546954 nova_compute[187160]: 2025-12-05 13:21:57.019 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:21:58 np0005546954 nova_compute[187160]: 2025-12-05 13:21:58.671 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:01 np0005546954 podman[225335]: 2025-12-05 13:22:01.540972133 +0000 UTC m=+0.056765507 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, config_id=edpm, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Dec  5 08:22:01 np0005546954 podman[225336]: 2025-12-05 13:22:01.54568326 +0000 UTC m=+0.058039317 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  5 08:22:02 np0005546954 nova_compute[187160]: 2025-12-05 13:22:02.021 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:03 np0005546954 nova_compute[187160]: 2025-12-05 13:22:03.675 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:05 np0005546954 podman[197513]: time="2025-12-05T13:22:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:22:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:22:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:22:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:22:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Dec  5 08:22:07 np0005546954 nova_compute[187160]: 2025-12-05 13:22:07.023 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:08 np0005546954 nova_compute[187160]: 2025-12-05 13:22:08.679 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:12 np0005546954 nova_compute[187160]: 2025-12-05 13:22:12.028 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:13 np0005546954 nova_compute[187160]: 2025-12-05 13:22:13.682 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:14 np0005546954 podman[225380]: 2025-12-05 13:22:14.530926606 +0000 UTC m=+0.046024532 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 08:22:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:22:16.988 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:22:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:22:16.988 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:22:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:22:16.988 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:22:17 np0005546954 nova_compute[187160]: 2025-12-05 13:22:17.028 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:18 np0005546954 nova_compute[187160]: 2025-12-05 13:22:18.686 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:22:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:22:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:22:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:22:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:22:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:22:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:22:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:22:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:22:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:22:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:22:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:22:21 np0005546954 podman[225401]: 2025-12-05 13:22:21.539995906 +0000 UTC m=+0.055130857 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 08:22:21 np0005546954 podman[225400]: 2025-12-05 13:22:21.615048431 +0000 UTC m=+0.132792773 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:22:22 np0005546954 nova_compute[187160]: 2025-12-05 13:22:22.030 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:23 np0005546954 nova_compute[187160]: 2025-12-05 13:22:23.689 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:27 np0005546954 nova_compute[187160]: 2025-12-05 13:22:27.068 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:22:27 np0005546954 nova_compute[187160]: 2025-12-05 13:22:27.078 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:28 np0005546954 nova_compute[187160]: 2025-12-05 13:22:28.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:22:28 np0005546954 nova_compute[187160]: 2025-12-05 13:22:28.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:22:28 np0005546954 nova_compute[187160]: 2025-12-05 13:22:28.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:22:28 np0005546954 nova_compute[187160]: 2025-12-05 13:22:28.059 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:22:28 np0005546954 nova_compute[187160]: 2025-12-05 13:22:28.693 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:29 np0005546954 nova_compute[187160]: 2025-12-05 13:22:29.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:22:29 np0005546954 nova_compute[187160]: 2025-12-05 13:22:29.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:22:30 np0005546954 nova_compute[187160]: 2025-12-05 13:22:30.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:22:32 np0005546954 nova_compute[187160]: 2025-12-05 13:22:32.080 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:32 np0005546954 podman[225447]: 2025-12-05 13:22:32.588520808 +0000 UTC m=+0.080271649 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., config_id=edpm, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, version=9.6, container_name=openstack_network_exporter)
Dec  5 08:22:32 np0005546954 podman[225448]: 2025-12-05 13:22:32.605271029 +0000 UTC m=+0.094041217 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.067 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.068 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.068 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.068 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.323 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.324 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5839MB free_disk=73.3333969116211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.324 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.325 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.400 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.401 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.506 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.530 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.532 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.532 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:22:33 np0005546954 nova_compute[187160]: 2025-12-05 13:22:33.696 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:35 np0005546954 podman[197513]: time="2025-12-05T13:22:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:22:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:22:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:22:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:22:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2592 "" "Go-http-client/1.1"
Dec  5 08:22:36 np0005546954 nova_compute[187160]: 2025-12-05 13:22:36.532 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:22:36 np0005546954 nova_compute[187160]: 2025-12-05 13:22:36.533 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:22:37 np0005546954 nova_compute[187160]: 2025-12-05 13:22:37.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:22:37 np0005546954 nova_compute[187160]: 2025-12-05 13:22:37.041 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:22:37 np0005546954 nova_compute[187160]: 2025-12-05 13:22:37.084 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:38 np0005546954 nova_compute[187160]: 2025-12-05 13:22:38.700 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:42 np0005546954 nova_compute[187160]: 2025-12-05 13:22:42.085 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:43 np0005546954 nova_compute[187160]: 2025-12-05 13:22:43.702 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:45 np0005546954 podman[225488]: 2025-12-05 13:22:45.536866878 +0000 UTC m=+0.050122881 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:22:47 np0005546954 nova_compute[187160]: 2025-12-05 13:22:47.087 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:48 np0005546954 nova_compute[187160]: 2025-12-05 13:22:48.746 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:22:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:22:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:22:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:22:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:22:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:22:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:22:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:22:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:22:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:22:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:22:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:22:52 np0005546954 nova_compute[187160]: 2025-12-05 13:22:52.088 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:52 np0005546954 podman[225510]: 2025-12-05 13:22:52.540218239 +0000 UTC m=+0.051345599 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 08:22:52 np0005546954 podman[225509]: 2025-12-05 13:22:52.569994525 +0000 UTC m=+0.083065036 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec  5 08:22:53 np0005546954 nova_compute[187160]: 2025-12-05 13:22:53.749 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:57 np0005546954 nova_compute[187160]: 2025-12-05 13:22:57.092 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:22:58 np0005546954 nova_compute[187160]: 2025-12-05 13:22:58.753 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:02 np0005546954 nova_compute[187160]: 2025-12-05 13:23:02.093 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:03 np0005546954 podman[225559]: 2025-12-05 13:23:03.540970445 +0000 UTC m=+0.056758927 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Dec  5 08:23:03 np0005546954 podman[225560]: 2025-12-05 13:23:03.548417287 +0000 UTC m=+0.059059568 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 08:23:03 np0005546954 nova_compute[187160]: 2025-12-05 13:23:03.757 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:05 np0005546954 podman[197513]: time="2025-12-05T13:23:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:23:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:23:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:23:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:23:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2595 "" "Go-http-client/1.1"
Dec  5 08:23:07 np0005546954 nova_compute[187160]: 2025-12-05 13:23:07.095 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:08 np0005546954 nova_compute[187160]: 2025-12-05 13:23:08.761 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:12 np0005546954 nova_compute[187160]: 2025-12-05 13:23:12.096 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:13 np0005546954 nova_compute[187160]: 2025-12-05 13:23:13.763 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:16 np0005546954 podman[225598]: 2025-12-05 13:23:16.563691429 +0000 UTC m=+0.069807133 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 08:23:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:23:16.989 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:23:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:23:16.989 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:23:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:23:16.989 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:23:17 np0005546954 nova_compute[187160]: 2025-12-05 13:23:17.098 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:18 np0005546954 nova_compute[187160]: 2025-12-05 13:23:18.766 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:23:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:23:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:23:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:23:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:23:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:23:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:23:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:23:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:23:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:23:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:23:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:23:22 np0005546954 nova_compute[187160]: 2025-12-05 13:23:22.099 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:23 np0005546954 podman[225620]: 2025-12-05 13:23:23.554056665 +0000 UTC m=+0.054077393 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 08:23:23 np0005546954 podman[225619]: 2025-12-05 13:23:23.583729919 +0000 UTC m=+0.092574512 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:23:23 np0005546954 nova_compute[187160]: 2025-12-05 13:23:23.791 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:27 np0005546954 nova_compute[187160]: 2025-12-05 13:23:27.100 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:28 np0005546954 nova_compute[187160]: 2025-12-05 13:23:28.035 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:23:28 np0005546954 nova_compute[187160]: 2025-12-05 13:23:28.794 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:30 np0005546954 nova_compute[187160]: 2025-12-05 13:23:30.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:23:30 np0005546954 nova_compute[187160]: 2025-12-05 13:23:30.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:23:30 np0005546954 nova_compute[187160]: 2025-12-05 13:23:30.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:23:30 np0005546954 nova_compute[187160]: 2025-12-05 13:23:30.063 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:23:30 np0005546954 nova_compute[187160]: 2025-12-05 13:23:30.064 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:23:31 np0005546954 nova_compute[187160]: 2025-12-05 13:23:31.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:23:31 np0005546954 nova_compute[187160]: 2025-12-05 13:23:31.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:23:32 np0005546954 nova_compute[187160]: 2025-12-05 13:23:32.103 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:33 np0005546954 nova_compute[187160]: 2025-12-05 13:23:33.798 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.190 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.191 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.191 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.191 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.420 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.421 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5848MB free_disk=73.33336639404297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.422 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.422 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:23:34 np0005546954 podman[225670]: 2025-12-05 13:23:34.594470536 +0000 UTC m=+0.090759816 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6, distribution-scope=public)
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.597 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.597 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.621 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:23:34 np0005546954 podman[225671]: 2025-12-05 13:23:34.622005963 +0000 UTC m=+0.113649497 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd)
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.776 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.778 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:23:34 np0005546954 nova_compute[187160]: 2025-12-05 13:23:34.779 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:23:35 np0005546954 podman[197513]: time="2025-12-05T13:23:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:23:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:23:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:23:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:23:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Dec  5 08:23:37 np0005546954 nova_compute[187160]: 2025-12-05 13:23:37.105 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:37 np0005546954 nova_compute[187160]: 2025-12-05 13:23:37.785 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:23:37 np0005546954 nova_compute[187160]: 2025-12-05 13:23:37.788 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:23:37 np0005546954 nova_compute[187160]: 2025-12-05 13:23:37.789 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:23:38 np0005546954 nova_compute[187160]: 2025-12-05 13:23:38.801 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:39 np0005546954 nova_compute[187160]: 2025-12-05 13:23:39.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:23:42 np0005546954 nova_compute[187160]: 2025-12-05 13:23:42.108 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:43 np0005546954 nova_compute[187160]: 2025-12-05 13:23:43.804 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:47 np0005546954 nova_compute[187160]: 2025-12-05 13:23:47.110 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:47 np0005546954 podman[225713]: 2025-12-05 13:23:47.530805911 +0000 UTC m=+0.042373599 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:23:48 np0005546954 nova_compute[187160]: 2025-12-05 13:23:48.808 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:23:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:23:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:23:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:23:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:23:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:23:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:23:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:23:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:23:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:23:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:23:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:23:52 np0005546954 nova_compute[187160]: 2025-12-05 13:23:52.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:23:52 np0005546954 nova_compute[187160]: 2025-12-05 13:23:52.110 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:53 np0005546954 nova_compute[187160]: 2025-12-05 13:23:53.811 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:54 np0005546954 podman[225734]: 2025-12-05 13:23:54.567004154 +0000 UTC m=+0.084514181 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible)
Dec  5 08:23:54 np0005546954 podman[225735]: 2025-12-05 13:23:54.567415717 +0000 UTC m=+0.077016998 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 08:23:57 np0005546954 nova_compute[187160]: 2025-12-05 13:23:57.111 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:23:58 np0005546954 nova_compute[187160]: 2025-12-05 13:23:58.815 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:02 np0005546954 nova_compute[187160]: 2025-12-05 13:24:02.113 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:03 np0005546954 nova_compute[187160]: 2025-12-05 13:24:03.817 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:05 np0005546954 podman[225786]: 2025-12-05 13:24:05.557071628 +0000 UTC m=+0.065483519 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 08:24:05 np0005546954 podman[225785]: 2025-12-05 13:24:05.557219943 +0000 UTC m=+0.068606155 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git)
Dec  5 08:24:05 np0005546954 podman[197513]: time="2025-12-05T13:24:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:24:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:24:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:24:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:24:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec  5 08:24:07 np0005546954 nova_compute[187160]: 2025-12-05 13:24:07.117 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:08 np0005546954 nova_compute[187160]: 2025-12-05 13:24:08.820 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:12 np0005546954 nova_compute[187160]: 2025-12-05 13:24:12.119 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:13 np0005546954 nova_compute[187160]: 2025-12-05 13:24:13.823 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:24:16.990 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:24:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:24:16.992 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:24:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:24:16.992 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:24:17 np0005546954 nova_compute[187160]: 2025-12-05 13:24:17.161 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:18 np0005546954 podman[225823]: 2025-12-05 13:24:18.552563385 +0000 UTC m=+0.067147040 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 08:24:18 np0005546954 nova_compute[187160]: 2025-12-05 13:24:18.827 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:24:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:24:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:24:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:24:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:24:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:24:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:24:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:24:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:24:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:24:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:24:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:24:22 np0005546954 nova_compute[187160]: 2025-12-05 13:24:22.163 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:23 np0005546954 nova_compute[187160]: 2025-12-05 13:24:23.833 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:25 np0005546954 podman[225847]: 2025-12-05 13:24:25.54996764 +0000 UTC m=+0.059304216 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:24:25 np0005546954 podman[225846]: 2025-12-05 13:24:25.569762236 +0000 UTC m=+0.083502629 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec  5 08:24:27 np0005546954 nova_compute[187160]: 2025-12-05 13:24:27.164 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:28 np0005546954 nova_compute[187160]: 2025-12-05 13:24:28.118 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:24:28 np0005546954 nova_compute[187160]: 2025-12-05 13:24:28.835 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:30 np0005546954 nova_compute[187160]: 2025-12-05 13:24:30.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:24:30 np0005546954 nova_compute[187160]: 2025-12-05 13:24:30.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:24:30 np0005546954 nova_compute[187160]: 2025-12-05 13:24:30.040 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:24:30 np0005546954 nova_compute[187160]: 2025-12-05 13:24:30.057 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:24:30 np0005546954 nova_compute[187160]: 2025-12-05 13:24:30.058 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:24:31 np0005546954 nova_compute[187160]: 2025-12-05 13:24:31.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:24:32 np0005546954 nova_compute[187160]: 2025-12-05 13:24:32.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:24:32 np0005546954 nova_compute[187160]: 2025-12-05 13:24:32.166 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:33 np0005546954 nova_compute[187160]: 2025-12-05 13:24:33.837 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:35 np0005546954 podman[197513]: time="2025-12-05T13:24:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:24:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:24:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:24:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:24:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.069 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.070 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.070 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.070 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.209 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.210 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5838MB free_disk=73.33336639404297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.211 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.211 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.288 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.288 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.313 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing inventories for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.351 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating ProviderTree inventory for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.352 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Updating inventory in ProviderTree for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.386 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing aggregate associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.431 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Refreshing trait associations for resource provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b, traits: COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.459 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.478 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.481 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:24:36 np0005546954 nova_compute[187160]: 2025-12-05 13:24:36.481 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:24:36 np0005546954 podman[225895]: 2025-12-05 13:24:36.54183441 +0000 UTC m=+0.047994133 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 08:24:36 np0005546954 podman[225894]: 2025-12-05 13:24:36.553199434 +0000 UTC m=+0.057840360 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm)
Dec  5 08:24:37 np0005546954 nova_compute[187160]: 2025-12-05 13:24:37.168 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:37 np0005546954 nova_compute[187160]: 2025-12-05 13:24:37.482 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:24:37 np0005546954 nova_compute[187160]: 2025-12-05 13:24:37.483 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:24:38 np0005546954 nova_compute[187160]: 2025-12-05 13:24:38.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:24:38 np0005546954 nova_compute[187160]: 2025-12-05 13:24:38.840 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:41 np0005546954 nova_compute[187160]: 2025-12-05 13:24:41.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:24:42 np0005546954 nova_compute[187160]: 2025-12-05 13:24:42.171 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:43 np0005546954 nova_compute[187160]: 2025-12-05 13:24:43.844 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:47 np0005546954 nova_compute[187160]: 2025-12-05 13:24:47.174 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:48 np0005546954 nova_compute[187160]: 2025-12-05 13:24:48.847 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:24:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:24:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:24:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:24:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:24:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:24:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:24:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:24:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:24:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:24:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:24:49 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:24:49 np0005546954 podman[225932]: 2025-12-05 13:24:49.551795207 +0000 UTC m=+0.068319717 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 08:24:52 np0005546954 nova_compute[187160]: 2025-12-05 13:24:52.177 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:53 np0005546954 nova_compute[187160]: 2025-12-05 13:24:53.850 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:56 np0005546954 podman[225954]: 2025-12-05 13:24:56.557969738 +0000 UTC m=+0.061227016 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 08:24:56 np0005546954 podman[225953]: 2025-12-05 13:24:56.610941886 +0000 UTC m=+0.116025401 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 08:24:57 np0005546954 nova_compute[187160]: 2025-12-05 13:24:57.177 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:24:58 np0005546954 nova_compute[187160]: 2025-12-05 13:24:58.853 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:02 np0005546954 nova_compute[187160]: 2025-12-05 13:25:02.179 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:03 np0005546954 nova_compute[187160]: 2025-12-05 13:25:03.856 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:05 np0005546954 podman[197513]: time="2025-12-05T13:25:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:25:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:25:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:25:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:25:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2594 "" "Go-http-client/1.1"
Dec  5 08:25:07 np0005546954 nova_compute[187160]: 2025-12-05 13:25:07.183 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:07 np0005546954 podman[226003]: 2025-12-05 13:25:07.539125952 +0000 UTC m=+0.049713497 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7)
Dec  5 08:25:07 np0005546954 podman[226004]: 2025-12-05 13:25:07.56633247 +0000 UTC m=+0.073543590 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd)
Dec  5 08:25:08 np0005546954 nova_compute[187160]: 2025-12-05 13:25:08.859 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:12 np0005546954 nova_compute[187160]: 2025-12-05 13:25:12.185 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:13 np0005546954 nova_compute[187160]: 2025-12-05 13:25:13.862 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:25:16.992 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:25:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:25:16.994 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:25:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:25:16.994 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:25:17 np0005546954 nova_compute[187160]: 2025-12-05 13:25:17.188 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:18 np0005546954 nova_compute[187160]: 2025-12-05 13:25:18.864 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:25:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:25:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:25:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:25:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:25:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:25:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:25:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:25:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:25:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:25:20 np0005546954 podman[226039]: 2025-12-05 13:25:20.572188048 +0000 UTC m=+0.082499498 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec  5 08:25:22 np0005546954 nova_compute[187160]: 2025-12-05 13:25:22.188 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:23 np0005546954 nova_compute[187160]: 2025-12-05 13:25:23.867 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:27 np0005546954 nova_compute[187160]: 2025-12-05 13:25:27.191 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:27 np0005546954 podman[226060]: 2025-12-05 13:25:27.532960543 +0000 UTC m=+0.043454002 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 08:25:27 np0005546954 podman[226059]: 2025-12-05 13:25:27.590185014 +0000 UTC m=+0.103277414 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 08:25:28 np0005546954 nova_compute[187160]: 2025-12-05 13:25:28.870 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:30 np0005546954 nova_compute[187160]: 2025-12-05 13:25:30.034 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:31 np0005546954 nova_compute[187160]: 2025-12-05 13:25:31.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:32 np0005546954 nova_compute[187160]: 2025-12-05 13:25:32.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:32 np0005546954 nova_compute[187160]: 2025-12-05 13:25:32.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:25:32 np0005546954 nova_compute[187160]: 2025-12-05 13:25:32.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:25:32 np0005546954 nova_compute[187160]: 2025-12-05 13:25:32.053 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:25:32 np0005546954 nova_compute[187160]: 2025-12-05 13:25:32.054 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:32 np0005546954 nova_compute[187160]: 2025-12-05 13:25:32.192 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:33 np0005546954 nova_compute[187160]: 2025-12-05 13:25:33.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:33 np0005546954 nova_compute[187160]: 2025-12-05 13:25:33.873 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:34 np0005546954 nova_compute[187160]: 2025-12-05 13:25:34.051 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:35 np0005546954 podman[197513]: time="2025-12-05T13:25:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:25:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:25:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:25:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:25:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.069 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.070 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.070 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.070 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.195 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.222 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.223 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5830MB free_disk=73.33336639404297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.223 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.223 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.291 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.292 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.315 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.328 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.330 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:25:37 np0005546954 nova_compute[187160]: 2025-12-05 13:25:37.332 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:25:38 np0005546954 podman[226105]: 2025-12-05 13:25:38.53754033 +0000 UTC m=+0.053214467 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
version=9.6, name=ubi9-minimal, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec  5 08:25:38 np0005546954 podman[226106]: 2025-12-05 13:25:38.566384916 +0000 UTC m=+0.067696176 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd)
Dec  5 08:25:38 np0005546954 nova_compute[187160]: 2025-12-05 13:25:38.877 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:40 np0005546954 nova_compute[187160]: 2025-12-05 13:25:40.334 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:42 np0005546954 nova_compute[187160]: 2025-12-05 13:25:42.195 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:43 np0005546954 nova_compute[187160]: 2025-12-05 13:25:43.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:43 np0005546954 nova_compute[187160]: 2025-12-05 13:25:43.880 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:45 np0005546954 nova_compute[187160]: 2025-12-05 13:25:45.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:45 np0005546954 nova_compute[187160]: 2025-12-05 13:25:45.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 08:25:45 np0005546954 nova_compute[187160]: 2025-12-05 13:25:45.337 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 08:25:47 np0005546954 nova_compute[187160]: 2025-12-05 13:25:47.196 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:48 np0005546954 nova_compute[187160]: 2025-12-05 13:25:48.883 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:25:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:25:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:25:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:25:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:25:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:25:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:25:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:25:49 np0005546954 openstack_network_exporter[199661]: ERROR   13:25:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:25:51 np0005546954 podman[226150]: 2025-12-05 13:25:51.540844029 +0000 UTC m=+0.055698114 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:25:52 np0005546954 nova_compute[187160]: 2025-12-05 13:25:52.198 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:53 np0005546954 nova_compute[187160]: 2025-12-05 13:25:53.887 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:55 np0005546954 nova_compute[187160]: 2025-12-05 13:25:55.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:55 np0005546954 nova_compute[187160]: 2025-12-05 13:25:55.039 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 08:25:57 np0005546954 nova_compute[187160]: 2025-12-05 13:25:57.116 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:25:57 np0005546954 nova_compute[187160]: 2025-12-05 13:25:57.262 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:25:58 np0005546954 podman[226171]: 2025-12-05 13:25:58.53651604 +0000 UTC m=+0.050612055 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 08:25:58 np0005546954 podman[226170]: 2025-12-05 13:25:58.564106839 +0000 UTC m=+0.081590130 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Dec  5 08:25:58 np0005546954 nova_compute[187160]: 2025-12-05 13:25:58.899 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:00 np0005546954 nova_compute[187160]: 2025-12-05 13:26:00.841 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:26:02 np0005546954 nova_compute[187160]: 2025-12-05 13:26:02.264 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:03 np0005546954 nova_compute[187160]: 2025-12-05 13:26:03.904 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:05 np0005546954 podman[197513]: time="2025-12-05T13:26:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:26:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:26:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:26:05 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:26:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Dec  5 08:26:07 np0005546954 nova_compute[187160]: 2025-12-05 13:26:07.267 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:08 np0005546954 nova_compute[187160]: 2025-12-05 13:26:08.910 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:09 np0005546954 podman[226218]: 2025-12-05 13:26:09.545923917 +0000 UTC m=+0.056230671 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:26:09 np0005546954 podman[226217]: 2025-12-05 13:26:09.552387927 +0000 UTC m=+0.059672957 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Dec  5 08:26:12 np0005546954 nova_compute[187160]: 2025-12-05 13:26:12.270 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:13 np0005546954 nova_compute[187160]: 2025-12-05 13:26:13.915 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:26:16.993 104428 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:26:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:26:16.994 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:26:16 np0005546954 ovn_metadata_agent[104423]: 2025-12-05 13:26:16.994 104428 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:26:17 np0005546954 nova_compute[187160]: 2025-12-05 13:26:17.272 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:18 np0005546954 nova_compute[187160]: 2025-12-05 13:26:18.919 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:26:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:26:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:26:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  5 08:26:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:26:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  5 08:26:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:26:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  5 08:26:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:26:19 np0005546954 openstack_network_exporter[199661]: ERROR   13:26:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  5 08:26:19 np0005546954 openstack_network_exporter[199661]: 
Dec  5 08:26:22 np0005546954 nova_compute[187160]: 2025-12-05 13:26:22.274 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:22 np0005546954 podman[226259]: 2025-12-05 13:26:22.54743701 +0000 UTC m=+0.057445858 container health_status cd0596ba876921a91539a9d42bf9ae676001ebf0ea08173952f5d8c5adc6e806 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec  5 08:26:23 np0005546954 nova_compute[187160]: 2025-12-05 13:26:23.922 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:27 np0005546954 nova_compute[187160]: 2025-12-05 13:26:27.275 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:28 np0005546954 nova_compute[187160]: 2025-12-05 13:26:28.925 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:29 np0005546954 podman[226279]: 2025-12-05 13:26:29.535038552 +0000 UTC m=+0.045993602 container health_status 53f8dbdf498913c7a57d0ca595911f077ee245bab717b380a9dc2f183d859cf3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 08:26:29 np0005546954 podman[226278]: 2025-12-05 13:26:29.551845965 +0000 UTC m=+0.067427248 container health_status 0e7d0470f799296b555c18fb3d923e812950f4afe483e03cf03d50949a7593dc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 08:26:32 np0005546954 nova_compute[187160]: 2025-12-05 13:26:32.243 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:26:32 np0005546954 nova_compute[187160]: 2025-12-05 13:26:32.243 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:26:32 np0005546954 nova_compute[187160]: 2025-12-05 13:26:32.243 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 08:26:32 np0005546954 nova_compute[187160]: 2025-12-05 13:26:32.243 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 08:26:32 np0005546954 nova_compute[187160]: 2025-12-05 13:26:32.278 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:32 np0005546954 nova_compute[187160]: 2025-12-05 13:26:32.536 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 08:26:33 np0005546954 nova_compute[187160]: 2025-12-05 13:26:33.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:26:33 np0005546954 nova_compute[187160]: 2025-12-05 13:26:33.930 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:34 np0005546954 nova_compute[187160]: 2025-12-05 13:26:34.040 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:26:34 np0005546954 systemd-logind[789]: New session 41 of user zuul.
Dec  5 08:26:34 np0005546954 systemd[1]: Started Session 41 of User zuul.
Dec  5 08:26:35 np0005546954 podman[197513]: time="2025-12-05T13:26:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 08:26:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:26:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  5 08:26:35 np0005546954 podman[197513]: @ - - [05/Dec/2025:13:26:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Dec  5 08:26:36 np0005546954 nova_compute[187160]: 2025-12-05 13:26:36.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:26:37 np0005546954 nova_compute[187160]: 2025-12-05 13:26:37.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:26:37 np0005546954 nova_compute[187160]: 2025-12-05 13:26:37.041 187164 DEBUG nova.compute.manager [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 08:26:37 np0005546954 nova_compute[187160]: 2025-12-05 13:26:37.280 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:38 np0005546954 nova_compute[187160]: 2025-12-05 13:26:38.038 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:26:38 np0005546954 nova_compute[187160]: 2025-12-05 13:26:38.586 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:26:38 np0005546954 nova_compute[187160]: 2025-12-05 13:26:38.587 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:26:38 np0005546954 nova_compute[187160]: 2025-12-05 13:26:38.587 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:26:38 np0005546954 nova_compute[187160]: 2025-12-05 13:26:38.587 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 08:26:38 np0005546954 nova_compute[187160]: 2025-12-05 13:26:38.720 187164 WARNING nova.virt.libvirt.driver [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 08:26:38 np0005546954 nova_compute[187160]: 2025-12-05 13:26:38.721 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5751MB free_disk=73.33309555053711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 08:26:38 np0005546954 nova_compute[187160]: 2025-12-05 13:26:38.722 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 08:26:38 np0005546954 nova_compute[187160]: 2025-12-05 13:26:38.722 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 08:26:38 np0005546954 nova_compute[187160]: 2025-12-05 13:26:38.933 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:38 np0005546954 ovs-vsctl[226506]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  5 08:26:39 np0005546954 nova_compute[187160]: 2025-12-05 13:26:39.593 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 08:26:39 np0005546954 nova_compute[187160]: 2025-12-05 13:26:39.594 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 08:26:39 np0005546954 nova_compute[187160]: 2025-12-05 13:26:39.624 187164 DEBUG nova.compute.provider_tree [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed in ProviderTree for provider: eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 08:26:39 np0005546954 virtqemud[186730]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  5 08:26:39 np0005546954 virtqemud[186730]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  5 08:26:39 np0005546954 virtqemud[186730]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  5 08:26:40 np0005546954 nova_compute[187160]: 2025-12-05 13:26:40.018 187164 DEBUG nova.scheduler.client.report [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Inventory has not changed for provider eaf4a6cb-3fbe-4f49-8f5e-462e5382b15b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 08:26:40 np0005546954 nova_compute[187160]: 2025-12-05 13:26:40.019 187164 DEBUG nova.compute.resource_tracker [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 08:26:40 np0005546954 nova_compute[187160]: 2025-12-05 13:26:40.019 187164 DEBUG oslo_concurrency.lockutils [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 08:26:40 np0005546954 podman[226717]: 2025-12-05 13:26:40.123087615 +0000 UTC m=+0.060917946 container health_status 02badb51617be118048bb5f17b77014e5e45edef298c88822dd2a0168b70c601 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., distribution-scope=public, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Dec  5 08:26:40 np0005546954 podman[226723]: 2025-12-05 13:26:40.129852966 +0000 UTC m=+0.068151572 container health_status bfccab39ab5112a6527afd2cc23bd08224ae237347c17a8c806edbc35a38006d (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Dec  5 08:26:42 np0005546954 nova_compute[187160]: 2025-12-05 13:26:42.281 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:43 np0005546954 systemd[1]: Starting Hostname Service...
Dec  5 08:26:43 np0005546954 systemd[1]: Started Hostname Service.
Dec  5 08:26:43 np0005546954 nova_compute[187160]: 2025-12-05 13:26:43.936 187164 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 08:26:44 np0005546954 nova_compute[187160]: 2025-12-05 13:26:44.020 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 08:26:45 np0005546954 nova_compute[187160]: 2025-12-05 13:26:45.039 187164 DEBUG oslo_service.periodic_task [None req-5e6b9f2f-0a23-4d41-bdf0-7e4c335ff697 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
